In recent years, the number of AI-capable laptops on the US market has grown significantly, driven by students and workers who increasingly need to perform AI tasks directly on their computers. Microsoft and its hardware partners are also working together on a new generation of Windows-based computers that let users run AI processing without relying on external cloud services, delivering faster performance.

This is all happening because of a new type of chip, the neural processing unit (NPU), capable of 40-50 trillion operations per second (TOPS). NPUs enable AI applications to run entirely on the laptop itself rather than on remote cloud servers. As a result, AI features in modern personal computers are shifting from "add-on" enhancements to an integral part of the machine itself.

The Rise of On-Device AI Computing  

For many years, typical AI workloads have been processed in the cloud because of their very high computational demands. Advances in chip design, however, increasingly let laptops run AI workloads locally, improving performance and reducing latency. Local AI applications can process data in real time without an active internet connection, which is especially valuable for features that must deliver immediate results, such as speech recognition, image analysis, and document analysis. Microsoft has been actively promoting this shift through its Windows ecosystem by encouraging developers to optimize their applications for local AI processing.

Understanding NPUs and Their Role  

Neural processing units are specialized chips developed to accelerate artificial intelligence (AI) computations. They execute the highly parallel operations typical of machine learning models far more efficiently than standard central processing units (CPUs) or graphics processing units (GPUs).

With performance levels of 40-50 TOPS and above, NPUs can execute complex AI workloads, such as real-time translation, noise suppression, and photo editing, directly on laptops.
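To make the pattern concrete, here is a minimal sketch of how an application might offload a model to an NPU using ONNX Runtime. The available execution providers depend on the vendor's runtime; the provider order, model file, and input name below are assumptions used for illustration, not a guaranteed setup.

```python
# Hedged sketch: offload an ONNX model to an NPU via ONNX Runtime.
# Provider names vary by vendor; "QNNExecutionProvider" (Snapdragon NPUs)
# is an assumption here, with CPU as a safe fallback.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "noise_suppression.onnx",           # hypothetical on-device model file
    providers=["QNNExecutionProvider",  # NPU path, if the runtime exposes it
               "CPUExecutionProvider"]  # fallback keeps the feature working
)

frame = np.zeros((1, 480), dtype=np.float32)    # one 10 ms audio frame at 48 kHz
outputs = session.run(None, {"input": frame})   # "input" is a placeholder tensor name
```

Because all inference happens in the local session, the audio never leaves the machine.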

NPUs also run these workloads with far less energy than traditional CPUs, which improves responsiveness while extending battery life.

Changing Workflows for Students and Professionals  

Laptops equipped with AI capabilities have changed the way students and professionals do their daily work. Students can use AI tools to support their research, writing, and data analysis. As a result, student learning has become more interactive and efficient.  

Those working in fields like content creation, software development, and data analysis can now work faster and automate repetitive tasks. AI-assisted coding, video editing, and document summarization are quickly becoming a normal part of professional workflows.

AI-capable devices are also being integrated with Microsoft applications and other products so that AI becomes an ongoing component of a user's productivity workflow.

Privacy and Offline Capabilities  

One major advantage of AI systems that run directly on the device is greater privacy for users. When data is processed locally, it never needs to be sent to an external server, greatly reducing the likelihood of user data being compromised.

This protection matters because professionals routinely handle sensitive material, including legal documents, financial information, and personal records.

On-device processing also works offline, letting users access AI functions without an internet connection. AI-ready laptops therefore offer greater versatility and reliability across diverse work settings.

Growing Ecosystem of AI Applications  

The growing availability of AI-enabled hardware has prompted businesses to build more AI-powered software. Developers are creating applications that use NPUs for tasks such as real-time collaboration, predictive analytics, and personalized user experiences.

Microsoft is a key component of this ecosystem because it offers development tools and frameworks that simplify the creation of AI-powered applications for Windows devices.  

As more applications support on-device AI, demand for AI-ready laptops will continue to grow.

Competition in the AI Laptop Market  

Competition in the AI-ready laptop market is intensifying rapidly as more manufacturers release laptops with advanced NPUs. Manufacturers are now differentiating their products primarily on three factors: performance, battery life, and compatibility with software ecosystems.

This competition drives rapid technological progress, producing devices with better performance and efficiency than existing products. As a result, AI-ready laptops are reaching a broader audience of potential users.

Microsoft uses its ecosystem strategy to establish a dominant position in a market undergoing transformation.  

Challenges in Adoption  

Despite their benefits, AI-ready laptops still face obstacles to adoption. Devices with advanced NPUs cost more than standard laptops, creating a significant price barrier.

AI-powered features also require users to learn new skills: new working methods and tools must be mastered before the productivity gains fully materialize.

Software optimization also remains a work in progress, since not all applications yet take advantage of NPUs.

Future of Personal Computing  

The arrival of built-in AI in laptops marks a significant turning point for personal computing. Going forward, AI is likely to be incorporated by default into laptop-class devices.

Future laptops will include more advanced AI capabilities, such as personalized assistants that learn user behavior, predictive performance optimization, and improved collaboration tools. Microsoft positions its platform as the leader for this transition by aiming to make AI technology practical and accessible for daily use.

Conclusion: AI Becomes a Core Computing Feature  

The growing use of AI-equipped computers by students and professionals represents a fundamental shift in how technology is used in the workplace and the classroom. Because information can be processed directly on the laptop, on-device AI delivers faster processing, more secure data handling, and greater flexibility.

The ongoing development of this ecosystem will embed AI more deeply in personal computing and create new digital experiences for users.

Source: Your productivity, supercharged 

Highlights 

  • Qualcomm and Snap have signed a multi-year strategic agreement deepening their 10-year partnership to accelerate innovation in wearable tech.  
  • The agreement will bring Snapdragon XR solutions to future versions of Specs.  
  • The collaboration will give developers and customers a strong foundation to create smarter experiences on eyewear.  

Qualcomm Technologies Inc and Specs Inc, a Snap subsidiary, announced a strategic agreement to use Qualcomm’s Snapdragon system-on-chip in future Specs generations.  

Powering The Next Generation Of Eyewear 

This is Specs Inc.'s first major project as it prepares to launch Specs, Snap's new see-through AR glasses (building on, but distinct from, Spectacles), later this year. Specs are standalone, see-through glasses that let users see, hear, and interact with digital content as if it were part of their surroundings, blending digital experiences with the real world.

Specs use Snapdragon XR platforms. These platforms combine edge AI with high performance and low power consumption, enabling intelligent, context-aware experiences to run directly on the device. This means faster and more private interactions. The initiative embodies both companies’ goals to make computing more human and more smoothly embedded in daily life, changing how people work, learn, and play together.  

Building on a Decade-Long Relationship 

Snap and Qualcomm Technologies have a history of innovation, with Snapdragon platforms powering earlier versions of Snap’s Spectacles (Snap’s previous AR glasses). This new agreement focuses on Specs, their new eyewear product, to further work in immersive technology.  

With coordinated planning and close collaboration, both companies aim to quickly add features to Specs such as on-device AI, advanced graphics, and multi-user digital experiences. "We believe the future of computing will be more human and grounded in the real world," said Evan Spiegel, co-founder and CEO, Snap Inc. "Our work with Qualcomm Technologies provides a foundation for the Specs we deliver, bringing advanced technology and performance to developers and consumers."

"The next era of computing will be defined by devices that understand what you see, hear, and say, as well as context, and respond instantly to the world around you," said Cristiano Amon, president and chief executive officer, Qualcomm Incorporated. "Our work on future generations of Specs will enable power-efficient interactive AR devices that deliver agentic experiences that feel natural and intuitive, and integrate seamlessly with daily life."

Qualcomm is committed to enabling intelligent computing everywhere and addressing global challenges. With over 40 years of technology leadership, we deliver solutions powered by AI and strong connectivity. Snapdragon platforms offer great consumer experiences, and our Dragon Wing products help industries grow. Working with partners, we drive digital transformation to improve lives, businesses, and society.  

Qualcomm Incorporated includes our licensing business, QTL, and the vast majority of our patent portfolio. Qualcomm Technologies Inc., a subsidiary of Qualcomm Incorporated, operates, along with its subsidiaries, substantially all of our engineering and research and development functions and substantially all of our product and service businesses, including our QCT semiconductor business. Snapdragon and Qualcomm-branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries. Qualcomm patents are licensed by Qualcomm Incorporated.  

About Specs Inc 

Specs Inc., a Snap subsidiary, is dedicated to more human computing by creating Specs eyewear that blends digital experiences with the real world.

Specs have see-through lenses for placing digital objects in 3D space and use Snap OS for natural interaction.  

Specs Inc. also offers Lens Studio, a suite for developers to build immersive AR experiences for Specs, Snapchat, and other platforms. 

Source: Qualcomm and Snap Expand Strategic Collaboration to Advance Intelligent Computing Experiences on Specs

A newly reported chip design could transform artificial intelligence hardware, delivering up to 100 times better performance through an architecture that operates independently of cloud services. The research combines memristor-based in-memory computing with secure processing methods so that AI models can execute directly on devices while consuming minimal power. The advance lets smartphones, Internet of Things devices, and edge systems run sophisticated AI tasks locally, improving processing speed, user data protection, and energy efficiency.

Redefining AI Hardware  

In a typical AI environment, vast amounts of cloud computing power are required to handle the immense volume of data generated by AI applications. This introduces latency issues due to long-distance data movement, as well as high energy consumption and privacy concerns when moving such sensitive information across networks. The memristor-based chips provide a different architecture that combines computation and memory into a single integrated unit.  

Because this new architecture eliminates the need to move large amounts of data back and forth between RAM and the CPU and GPU for AI computation, it addresses a key bottleneck in AI implementations. Additionally, the memristor-enabled chip enables AI algorithms to be executed directly on the device, creating opportunities for new real-time AI applications across robotics, autonomous vehicles, wearables, smart sensors, and more.  

Energy Efficiency and Sustainability  

A major benefit of the new chip is its energy efficiency. AI processing typically consumes large amounts of electrical power, most of it in massive data centres. Because computation takes place where the data is stored, far less data needs to move, and memristor switching itself consumes very little energy.

The lower electrical cost of AI processing, enabled by energy-efficient chips, also means a lower carbon footprint. With rapid growth in AI use across sectors, energy-efficient hardware solutions will be key to ensuring large-scale AI implementations are environmentally sustainable.  

On-Device AI and Privacy  

By enabling AI to run locally on IoT and edge devices, the chip also helps address growing concerns about data privacy. Sensitive information of all kinds, such as personal health data, financial transactions, and proprietary business information, can be processed without ever leaving the device.

In addition, on-device processing reduces response latency across all the examples above; therefore, these AI models can provide real-time responses. This level of capability is crucial for scenarios like autonomous navigation, real-time translation, and augmented/virtual reality (VR/AR), where speed and immediacy are critical to the user experience and operational dependability.  

Memristor-Based In-Memory Computing  

At the core of this advance is memristor technology. Memristors are memory devices that both store data and process it, so computation happens where the data resides rather than in a conventional CPU-GPU architecture that separates memory from processing.

The chip organises many memristors into arrays that perform AI computations in parallel, allowing it to handle large neural networks without increasing energy consumption or physical size.
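As a rough illustration of why this matters, the sketch below simulates the core operation a memristor crossbar performs physically: stored conductances act as the weight matrix, applied voltages act as the input vector, and the summed column currents are the dot products, so the multiply-accumulate happens where the weights live. The array sizes and values are illustrative only.

```python
# Toy simulation of in-memory matrix-vector multiplication (memristor crossbar).
# Physically: column current I_j = sum_i G_ij * V_i (Kirchhoff's current law),
# so the whole product is computed in place, with no weight movement to a CPU/GPU.
import numpy as np

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(128, 64))   # conductances = stored weight matrix
V = rng.uniform(0.0, 0.2, size=128)         # applied voltages = input activations

I = G.T @ V                                  # all 64 outputs produced in one parallel step
print(I.shape)                               # (64,)
```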

Security and Trust  

The chip also incorporates multiple security features. Calculations are executed on a physically secure medium, reducing exposure to external threats and data loss. Secure-by-design AI hardware will be crucial in sensitive sectors such as healthcare, banking, defence, and autonomous systems.

Combining high-performance AI processing with strong security on a single chip is a significant advance and gives businesses and end users alike a far more complete AI solution.

Implications for Edge Computing  

The new technology is likely to accelerate the adoption of edge computing by bringing AI functionality closer to where data is generated and collected, shifting computation from cloud servers to the edge applications themselves. Computing locally in this way gives edge applications quicker response times than cloud computing, with greater reliability and substantially lower operating costs.

Manufacturers, logistics companies, smart cities, and autonomous systems will all benefit from this technology. Edge computing will let real-time analytics, predictive maintenance, and adaptive control systems operate more efficiently than they would if they relied continuously on the cloud.

Transforming AI Applications  

With this chip, more complex AI models can run on smaller, more portable devices, letting developers embed sophisticated neural networks directly into applications on the device itself. This opens up a much larger pool of potential applications, from computer vision and natural language processing to reinforcement learning, whether online or offline.

By giving small and mid-sized companies, start-ups, and researchers the ability to innovate with on-device AI, the technology helps democratise AI, making it accessible to smaller organisations without the need to deploy large amounts of infrastructure.

Competitive Advantage in AI Hardware  

The increasing demand for AI worldwide has made hardware efficiency and speed key competitive advantages. Major companies, including NVIDIA and Intel, continue to invest heavily in AI acceleration; a memristor-based chip, however, offers a fundamentally different approach, combining memory and computation in one place. That combination appeals to anyone seeking high-performance, low-power AI processing, with added security benefits.

Many market analysts believe that innovations such as this will change the requirements for AI infrastructure, reduce reliance on traditional cloud-based solutions, and alter how companies economically deploy AI.  

Future Directions and Development  

The researchers are investigating how to continue scaling this technology by increasing memristor density, improving fabrication processes, and integrating the chip into a wide range of devices and platforms. Additional refinements will enable even larger neural networks to support enhanced AI capabilities and broader use in consumer and enterprise devices.  

Moreover, the technology provides an avenue for hybrid AI systems that allow some processing to occur locally, while more complex or aggregated processing can leverage cloud resources, creating a flexible and efficient AI ecosystem.  
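A minimal sketch of that hybrid idea, with hypothetical thresholds and function names, might route each request based on whether the model fits on the device and how tight the latency budget is:

```python
# Hedged sketch of hybrid local/cloud routing; thresholds are hypothetical.
def route_request(model_size_gb: float, latency_budget_ms: float,
                  device_memory_gb: float = 8.0) -> str:
    """Pick where to run an AI request: locally when the model fits and
    latency is tight, otherwise defer to the cloud."""
    if model_size_gb <= device_memory_gb and latency_budget_ms < 200:
        return "on_device"
    return "cloud"

print(route_request(3.5, 50))     # small model, tight latency -> on_device
print(route_request(70.0, 500))   # large model, relaxed latency -> cloud
```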

Potential Challenges  

Although memristor-based AI chips hold great promise, several challenges must be overcome before they can be adopted at scale in everyday consumer settings: manufacturing them in volume, building supporting software, and optimising AI models to exploit in-memory processing. Researchers and engineers will need to solve all of these before memristor-based AI chips become commercially viable and ready for widespread use.

However, reports from research laboratories indicate significant potential for memristor-based AI chips, and partnerships between chip manufacturers and AI developers may help accelerate the transition from laboratory prototypes to commercially available products.  

Broader Implications  

The breakthrough's impact extends beyond raw AI performance: it could set new standards for energy-efficient computing, secure processing at the device level, and rapid deployment of intelligent systems. Reducing reliance on cloud infrastructure could make systems more resilient, cut costs, and broaden global access to artificial intelligence.

Individuals and businesses will soon work with smart devices that are efficient, respect privacy, and deliver faster insights than ever before, changing how artificial intelligence becomes part of everyday life.

Conclusion: A New Era of AI Efficiency  

The latest memristor-based chip marks a major step forward for artificial intelligence hardware. Integrating in-memory processing, security, and energy efficiency lets devices run high-performance AI models without relying on cloud services.

These advantages will let AI applications operate more quickly and efficiently than ever before while prioritising user privacy, and they will open new opportunities across many industry sectors, leading to entirely new ways of deploying AI. With continued development, this chip could reshape the AI landscape, bringing powerful, efficient, and secure AI to many more people and devices.

Source: https://phys.org/ 

AI laptops are becoming increasingly popular across America. The fast-growing trend toward AI running on devices rather than in the cloud has prompted chip makers and tech companies to offer a range of chips and systems that run AI tasks directly on the laptop, reducing dependence on distant data centres. This shift reflects a broader push across the technology industry to improve performance, protect privacy, and operate more efficiently by integrating AI hardware and software into the devices we use every day.

The Rise of On-Device AI  

On-device artificial intelligence enables devices to perform AI tasks without requiring a connection to an external data center or server. This is particularly useful for using artificial intelligence to process information quickly and to ensure that personal information remains private, without relying on internet connectivity.  

On-device AI lets a device execute tasks that would normally require sending information to and from external systems, so it can work in real time. Real-time tasks include converting spoken words into text, recognising people or objects in images, and optimising an operating system (OS) for better efficiency. Such capabilities matter most for laptop users who want their devices to run smoothly across productivity, creativity, and communication tasks.
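For example, speech-to-text can run entirely on the laptop with an open-source model. The sketch below assumes the openai-whisper package and a local audio file; it illustrates the general pattern, not a specific vendor's feature.

```python
# Hedged sketch: fully local speech-to-text with the open-source Whisper model.
# The model weights are cached on disk and no audio leaves the machine.
import whisper

model = whisper.load_model("base")           # small model suited to laptop hardware
result = model.transcribe("meeting.wav")     # "meeting.wav" is a hypothetical local file
print(result["text"])
```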

As on-device artificial intelligence becomes more prevalent, there is increasing demand for hardware specifically designed to support its operation.  

Advancements in AI-Focused Processors  

AI-focused hardware is emerging as industry leaders such as Intel, Apple, and AMD race to deliver AI-enabled processors. These new chips include specialised components, such as Neural Processing Units (NPUs), to improve the speed and power efficiency of AI tasks.

Intel's Core Ultra processors integrate an NPU alongside the CPU and GPU to execute AI workloads directly on the computer, so applications can run functions such as real-time transcription or AI-enhanced photo editing locally. Apple silicon includes a Neural Engine that lets Apple devices run AI workloads on-device. AMD's Ryzen AI processors similarly focus on running AI workloads locally to improve the performance of machine-learning applications.

These technological advancements will have a significant impact on the development of the next generation of laptops and desktop computers, including the introduction of fundamental AI capabilities.  

Enhancing Performance and Efficiency  

On-device AI offers many advantages. One important advantage is that on-device AI can process data locally and produce results more quickly, with little or no wait time, improving the overall user experience. This is especially true for applications that require quick or immediate processing time, such as speech recognition, video editing, and real-time collaboration tools.  

In addition to faster response times, on-device AI is more energy efficient. Dedicated AI hardware performs complex calculations more efficiently than a general-purpose processor, using less power and extending battery life.
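A back-of-the-envelope comparison shows why this matters for battery life. The figures below are hypothetical but representative: a 2-billion-operation model on a 40 TOPS / 10 W NPU versus a 2 TOPS / 15 W CPU path; real numbers vary widely by chip and workload.

```python
# Hypothetical energy-per-inference comparison (illustrative numbers only).
model_ops = 2e9                        # operations per inference

npu_time_s = model_ops / 40e12         # 0.05 ms on a 40 TOPS NPU
cpu_time_s = model_ops / 2e12          # 1.0 ms on a 2 TOPS CPU path

npu_energy_mj = npu_time_s * 10 * 1e3  # ~0.5 mJ at 10 W
cpu_energy_mj = cpu_time_s * 15 * 1e3  # ~15 mJ at 15 W

print(f"NPU: {npu_energy_mj:.2f} mJ, CPU: {cpu_energy_mj:.2f} mJ per inference")
```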

These performance gains make AI-enabled computers well suited to both business and home use, since AI features can run without degrading the computer's overall performance.

Privacy and Data Security Advantages  

The primary reason for the rise in on-device AI use is concern about data privacy. On-device AI uses local data processing rather than sending that information to an external server, which reduces the likelihood that sensitive content will be transmitted externally and placed at risk.  

This is particularly useful in applications that process personal data, such as voice recognition, document analysis, and biometric authentication. This way, users can benefit from the AI service while also having more control over their own data and preventing it from being put at risk.  

With increased government regulation of data protection, on-device AI will remain important for businesses to remain compliant and build customer trust.  

Expanding Use Cases for AI Laptops  

Artificial intelligence laptops open up new uses across a range of fields. In professional settings, they enable enhanced productivity tools (including automated workflows), better collaboration, and AI-assisted content creation. Professionals in creative fields can edit, render, and generate content more efficiently using software that takes full advantage of the hardware, further increasing their productivity.

In educational settings, AI laptops support individualised, differentiated learning in the classroom. Everyday users benefit from smart assistants that help with tasks, enhanced search capabilities, and more streamlined, automated system maintenance.

The increasing popularity of AI laptops can be attributed to their versatility, which enables them to meet diverse user needs.  

Competitive Landscape and Industry Momentum  

The fast-paced development of AI laptops means technology businesses now compete vigorously to make their products stand out through performance, features, and ecosystem integration. Manufacturers that can merge hardware and software capabilities effectively will hold a competitive advantage.

Partnerships among chip manufacturers, software developers, and device makers are key to driving innovation, ensuring that AI capabilities are optimised across the platforms and applications in which they run.

As the market continues to evolve, there will be a strong focus on delivering an efficient, seamless AI experience for customers, which will ultimately determine a business’s success or failure in this fast-changing marketplace.  

Challenges in Adoption  

Although AI laptops offer numerous benefits, they still face barriers that may slow customer adoption. One is the potential for advanced AI hardware to drive up manufacturing costs, which may translate into higher consumer prices than for traditional laptops.

To fully benefit from AI hardware capabilities, a software ecosystem must be in place. If the applications available on an AI-enabled laptop are not optimised, the AI hardware will not be used to its full potential.  

Another barrier to realising AI laptops’ full potential is the ongoing challenge manufacturers face in balancing performance with power consumption and thermal management.  

Future Developments in AI Computing  

The development of AI-powered laptops will primarily depend on continued improvements in processor design, software optimisation, and end-user experience. The growing efficiency of AI models will enable the device itself to handle more complex tasks, thereby significantly reducing the need for cloud computing.  

Advancements in emerging technologies such as edge computing and hybrid AI systems will likely enhance the capabilities of AI-enabled laptops. For example, as new technologies emerge, laptops could easily switch between cloud and local processing, depending on the task at hand. Continued innovation will be the key to realising the full potential of AI-enabled devices. 

Sources: Unlock more everything

Apple Newsroom

Apple is increasing its use of on-device artificial intelligence to strengthen user privacy protection across its entire product ecosystem. The move reflects the company’s long-standing position that personal data should remain under user control, even as AI-driven features become more central to modern computing experiences.  

The update builds on Apple’s privacy framework, especially in areas like Siri. The company is shifting more processing to the device rather than to cloud-based systems. This approach reduces data transmission from the device while offering advanced AI functions.  

A Privacy-First Approach to AI  

Apple focuses on processing information on the device itself, so that as much data as possible never leaves it. Cloud-based systems, by contrast, require transmitting data over the internet to remote servers that conduct the analysis.

Keeping data local protects it from external threats and reduces the risk of unauthorised access. The company has demonstrated that voice commands and app usage patterns can be analysed without requiring cloud storage.

This method is of great importance because AI technologies now operate across many daily activities, including messaging systems, search functions, personalised recommendation services, and voice-assistance tools.  

Siri and the Shift to On-Device Processing  

Siri is the main way Apple delivers on-device AI. The company works to ensure more voice requests are handled directly on the device rather than on external servers.  

The new system enables Siri to respond faster while better protecting user information, since many requests can be processed entirely locally and never need to leave the device.

Apple highlights improvements in how Siri handles user data, including minimising the amount of information it collects and using techniques such as random identifiers instead of personal accounts, balancing performance and privacy.
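The random-identifier idea can be illustrated with a small sketch: requests are tagged with a rotating random token rather than an account ID, so server-side logs cannot be tied back to a person. This is a generic illustration of the technique, not Apple's actual implementation; the rotation interval is hypothetical.

```python
# Hedged sketch of the rotating random-identifier pattern (not Apple's code).
import secrets
import time

ROTATION_SECONDS = 15 * 60   # hypothetical rotation interval
_state = {"token": secrets.token_hex(16), "issued": time.time()}

def request_identifier() -> str:
    """Return a random token, regenerated periodically, instead of an account ID."""
    if time.time() - _state["issued"] > ROTATION_SECONDS:
        _state["token"] = secrets.token_hex(16)
        _state["issued"] = time.time()
    return _state["token"]

print(request_identifier())
```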

The challenge is maintaining strong performance without falling back on cloud resources: local processing requires both efficient hardware and optimised software to deliver accurate, fast results on the device.

Apple's custom silicon, including its Neural Engine, is central to this effort. The chips run machine learning operations directly on user devices, allowing advanced AI models to execute locally.

Apple treats this combination of advanced features and privacy protection as a competitive advantage over other companies.

Reducing Data Collection and Storage  

Apple designs its data collection to avoid gathering and storing unnecessary information about users. The company processes data in real time and deletes it after a session ends, rather than building comprehensive user profiles.

These practices reduce exposure to data breaches and unauthorised access. Apple also uses aggregation and anonymisation to protect user identity while still drawing on data to improve its services.

Apple restricts its data storage practices to ensure its artificial intelligence systems conform to its overall privacy policies across the Apple ecosystem.  

Apple applies these core principles to extend on-device AI beyond Siri across its entire product ecosystem, including iPhone, iPad, and Mac. Local processing powers features such as predictive text, photo recognition, and app suggestions, which increasingly depend on this technology. Personal information stays protected because messages, images, and usage statistics remain on the device.

Deploying on-device AI across products establishes a unified privacy experience for users switching between devices.  

Implications for Users in the United States  

Apple's approach offers a distinct value proposition in the United States market, where customers increasingly worry about data privacy and security. The company safeguards personal information by processing customer data locally.

The world today places great significance on artificial intelligence, as it now forms an essential part of everyday human activities. The growing use of intelligent systems by users creates a need to implement privacy and security measures during these interactions. Apple’s approach to privacy protection will shape how customers expect products to function while pushing competitors to adopt similar privacy practices. 

Competing Approaches in the AI Landscape  

Apple's major competitors rely on cloud-based AI systems, whereas Apple favours on-device processing for its AI functions. Cloud systems provide greater computational capability, but they require gathering more comprehensive user data to operate.

These different methods reveal a fundamental tension in the industry: one approach pursues optimal system performance through centralised data handling, while the other decentralises operations to keep data with the user.

Apple believes that privacy protection can be a competitive edge, and that users will increasingly demand it as they become more informed about how their data is handled.

Challenges and Limitations  

On-device artificial intelligence systems offer benefits but also pose drawbacks. The processing power of local systems falls short of what cloud systems can deliver when handling complex tasks that require extensive resources.  

The limitations of device hardware include both its processing capacity and power consumption. The ongoing challenge involves maintaining efficient AI operations that do not consume excessive system resources.  

Apple must continue improving its technological solutions to strike the right balance between these requirements and its commitment to user privacy.  

The Future of Privacy-Centric AI  

The expansion of on-device AI shows that people now prefer computing systems that protect their private information. The data-handling methods used by everyday devices that incorporate AI technology will determine how much trust users place in them.  

Apple shows that companies can provide smart features through their products without collecting large amounts of user data. This model could become more common as regulations tighten and user expectations evolve.

Source: Our longstanding privacy commitment with Siri