Apple has implemented new restrictions on the usage of external artificial intelligence (AI) tools, including ChatGPT, by its employees, according to a report by the Wall Street Journal. Citing a document and insider sources, the report reveals that Apple is developing its own similar AI technology and is concerned about the potential leakage of confidential data by employees utilising external AI programs.
Additionally, Apple has reportedly advised its employees against using Copilot, a software code automation tool owned by Microsoft and hosted on GitHub. The company's cautionary stance comes amidst worries surrounding the unauthorised disclosure of sensitive information.
OpenAI, the organisation behind ChatGPT, recently introduced an "incognito mode" for ChatGPT. This mode ensures that users' conversation history is neither stored nor used to improve the AI model. The move came in response to mounting scrutiny over how AI systems like ChatGPT and their derivative chatbots handle and use vast amounts of user data for training and improvement.
Coinciding with this development, OpenAI launched the ChatGPT app for Apple's iOS platform in the United States earlier today.
As Apple continues to develop its own AI technology, these restrictions on external AI tools are likely aimed at protecting intellectual property and maintaining control over data usage. Meanwhile, other companies like Samsung, JP Morgan Chase, Deutsche Bank, and Amazon have also restricted the use of ChatGPT by their employees.