Amazon sources all of the processors, RAM and internal storage for its Fire tablets and Kindle e-readers from a myriad of companies. The company could design its own ARM-based system on a chip, similar to Apple's M1. Instead of spending months, or even years, designing it, Amazon could leverage AI on its AWS supercomputers to create a design in hours. This would lower the cost of upcoming tablets and e-readers and also make them more responsive.
Amazon's future plans are not as far-fetched as they sound. Google announced a collaboration with Samsung to develop an in-house CPU to compete with the A14 Bionic and the Snapdragon 888. The first Google-designed chipset will appear in the Pixel 6 series this autumn. Google is designing the next-generation chipset with the help of artificial intelligence (AI), and the AI takes only six hours to produce a layout, compared to the months human designers need. Floorplanning has traditionally been a highly manual and time-consuming task: teams would split larger chips into blocks and work on parts in parallel, fiddling around to find small refinements. What Google has done is turn the floorplanning problem into a task for a neural network. It treats a blank chip and its millions of components as a complex jigsaw with a vast number of possible solutions. The aim is to optimize whatever parameters the engineers decide are most important, while also placing all the components, and the connections between them, accurately. In essence, an AI lays out the chip instead of people, which means the SoC can be manufactured much sooner.
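To give a rough sense of what "floorplanning as an optimization problem" means, here is a minimal toy sketch. Google's published approach uses a deep reinforcement-learning agent; the sketch below uses simple simulated annealing instead, and the block names, netlist and grid are entirely hypothetical, purely to illustrate the idea of searching for a placement that minimizes wiring distance.

```python
import math
import random

# Hypothetical blocks and nets: each net lists the two blocks it connects.
# These names are illustrative only, not a real chip netlist.
BLOCKS = ["cpu", "gpu", "ram", "modem", "io"]
NETS = [("cpu", "ram"), ("cpu", "gpu"), ("gpu", "ram"), ("cpu", "io"), ("modem", "io")]
GRID = 4  # place blocks on a 4x4 grid of candidate slots


def wirelength(placement):
    """Total Manhattan distance over all nets -- the quantity we try to minimize."""
    total = 0
    for a, b in NETS:
        (ax, ay), (bx, by) = placement[a], placement[b]
        total += abs(ax - bx) + abs(ay - by)
    return total


def random_placement():
    """Assign each block to a distinct random grid slot."""
    slots = random.sample([(x, y) for x in range(GRID) for y in range(GRID)], len(BLOCKS))
    return dict(zip(BLOCKS, slots))


def anneal(steps=20000, start_temp=5.0):
    """Simulated annealing: swap two blocks, keep the move if it helps,
    or occasionally even if it hurts, to escape local minima."""
    placement = random_placement()
    cost = wirelength(placement)
    for step in range(steps):
        temp = start_temp * (1 - step / steps) + 1e-3
        a, b = random.sample(BLOCKS, 2)
        placement[a], placement[b] = placement[b], placement[a]
        new_cost = wirelength(placement)
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost  # accept the swap
        else:
            placement[a], placement[b] = placement[b], placement[a]  # undo it
    return placement, cost


if __name__ == "__main__":
    layout, cost = anneal()
    print("final wirelength:", cost)
    for block, slot in sorted(layout.items()):
        print(f"  {block}: {slot}")
```

Real floorplanning optimizes far more than wirelength (timing, congestion, power), but the shape of the problem is the same: an enormous number of legal placements and a cost function to drive down, which is exactly the kind of search a neural network can learn to do well.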
Google isn't the only company with a myriad of AI and neural-network resources. Amazon has its own deep-learning tools on AWS, with support for frameworks such as TensorFlow, and it offers various AI services that people use to build comprehensive systems. I am fairly certain that Amazon could design its own system on a chip using its own AWS infrastructure.
For the Amazon Kindle line of e-readers, most of the cost of a new piece of hardware comes from the NXP processor packed inside. Amazon also needs to source the RAM, storage, motherboard, battery and everything else. The Fire line of devices is a bit more complex: the processors are all ARM-based chips, but Amazon has always used different components and displays across models, which is why the tablets range in price from $99 to $399.
One of the benefits of a system on a chip is that the RAM, storage, processor and graphics all live on one chip. Computational processes have less distance to travel, since the components are directly interconnected. An SoC can also integrate peripherals like Wi-Fi and cellular radio modems, which matters for the Kindles that have 4G/LTE internet access.
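To make the integration point concrete, here is a minimal sketch comparing how many separate packages a board has to wire together in a discrete design versus an SoC design. The part lists are entirely hypothetical, not Amazon's actual bill of materials.

```python
from dataclasses import dataclass

# Hypothetical part lists -- illustrative only, not Amazon's actual bill of materials.

@dataclass
class Part:
    name: str
    on_soc: bool  # True if the block lives on the single SoC die/package


DISCRETE_DESIGN = [
    Part("ARM CPU (separate package)", False),
    Part("GPU", False),
    Part("RAM", False),
    Part("Wi-Fi module", False),
    Part("4G/LTE modem", False),
]

SOC_DESIGN = [
    Part("ARM CPU cores", True),
    Part("GPU", True),
    Part("RAM (on-package)", True),
    Part("Wi-Fi radio", True),
    Part("4G/LTE modem", True),
    Part("eMMC storage", False),  # bulk storage typically stays off-die
]


def separate_packages(parts):
    """Count how many separate chips the board has to connect together."""
    off_soc = sum(1 for p in parts if not p.on_soc)
    soc_itself = 1 if any(p.on_soc for p in parts) else 0
    return off_soc + soc_itself


if __name__ == "__main__":
    print("discrete design packages:", separate_packages(DISCRETE_DESIGN))  # 5
    print("SoC design packages:     ", separate_packages(SOC_DESIGN))       # 2
```

Fewer separate packages means shorter signal paths between components and fewer parts to source, which is the core of the cost and responsiveness argument.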
Many companies are leaping into developing their own SoCs, since there are a ton of benefits. Aside from Apple, which has a proven track record with its M1 chips, Intel has announced a series of chips that will be released in the next year or two. Microsoft is designing its own silicon for its Surface products. And since Samsung is manufacturing the new Google SoC, it won't be long before it makes its own.
I think Amazon would be best served by leveraging all of its machine learning and neural networks to design a system on a chip. This would be cheaper than buying the hardware piecemeal, and it would save on the licensing fees Amazon pays NXP for the Kindle processor. Having the RAM, processor, Wi-Fi, LTE and everything else on the same chip would lower costs, making future Kindles and Fire tablets cheaper and more power efficient.
Michael Kozlowski is the editor-in-chief at Good e-Reader and has written about audiobooks and e-readers for the past fifteen years. Newspapers and websites such as the CBC, CNET, Engadget, Huffington Post and the New York Times have picked up his articles. He lives in Vancouver, British Columbia, Canada.