June 14, 2024

Light of the Future: The Rise of Optical Computers in Artificial Intelligence Applications

Hello Tomorrow Turkey

Optical computers offer a potential answer to the high energy consumption and speed requirements of artificial intelligence, including large language models (LLMs). By harnessing the power of photonics, these systems could run advanced language processing workloads faster and more efficiently.


Moore's Law states that computer chips will contain twice as many transistors every two years, leading to significant leaps in speed and efficiency. However, the computational demands of the deep learning era have already outpaced this rate, and needs in this field are growing at an unsustainable pace. The International Energy Agency predicts that artificial intelligence will consume 10 times more power in 2026 than in 2023, and that data centers will consume as much energy as Japan that year. Nick Harris, founder of the unicorn startup Lightmatter, says that the computational power required by artificial intelligence is doubling every three months, and that this growth will strain companies and economies.

Lightmatter's technology
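
A rough back-of-the-envelope comparison, sketched below using the doubling periods quoted above (the two-year time span is an illustrative choice), shows how quickly these two growth rates diverge:

# Compare Moore's Law (transistor count doubling every 24 months)
# with the quoted growth in AI compute demand (doubling every 3 months).
months = 24  # look two years ahead (illustrative)

moore_growth = 2 ** (months / 24)  # ~2x more transistors on a chip
ai_growth = 2 ** (months / 3)      # ~256x more compute demanded by AI

print(f"After {months} months: chips ~{moore_growth:.0f}x, AI demand ~{ai_growth:.0f}x")

In other words, over a single two-year chip generation, demand grows by a factor of roughly 256 while transistor counts merely double, which is why fundamentally different hardware approaches are being sought.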


This is where the idea of using photons instead of electrons for information processing comes in. Optical computers could be groundbreaking for artificial intelligence applications that demand high speed and efficiency, and they have shown great potential in matrix multiplication operations in particular.

Matrix multiplication is a fundamental step both in training artificial neural networks and in processing new data. Since multiplying very large matrices requires enormous computational power, new algorithms for it are continuously being developed.
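
To illustrate why this one operation dominates the workload, here is a minimal sketch (using NumPy, with layer sizes chosen purely for illustration) showing that the forward pass of a single dense neural-network layer is essentially one large matrix multiplication:

import numpy as np

# Hypothetical sizes: a batch of 64 inputs, each with 1,024 features,
# passed through a dense layer with 4,096 output units.
batch, in_features, out_features = 64, 1024, 4096

x = np.random.randn(batch, in_features)          # input activations
W = np.random.randn(in_features, out_features)   # layer weights
b = np.zeros(out_features)                       # bias term

# The forward pass is one matrix multiplication plus a bias:
# roughly 2 * batch * in_features * out_features floating-point operations.
y = x @ W + b
print(y.shape)  # (64, 4096)

Training repeats this kind of multiplication billions of times across many layers and batches, which is why a faster, more energy-efficient way to multiply matrices is so attractive.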


Optical signals can carry more information than electrical signals and, because they operate at higher frequencies, can perform more operations in less time. Moreover, unlike electronic chips, optical computers can in theory perform more operations in parallel while using less energy.


Recently, researchers have developed various types of optical computers. In one notable study, Zaijun Chen and his team at MIT introduced a new optical network called HITOP, which can run machine learning models 25,000 times larger than previous chip-based optical neural networks.


LLMs, meanwhile, are revolutionizing natural language processing: trained on very large datasets, they can generate human-like text and are used in applications ranging from text generation and translation to question answering and summarization. Training and running these models, however, requires substantial energy and computational power, and optical computers could play a crucial role in reducing those training and operating costs.
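
To give a sense of that scale, the sketch below applies a commonly used rule of thumb (roughly six floating-point operations per parameter per training token; the model and dataset sizes here are purely hypothetical and not taken from the article) to estimate training compute:

# Rough training-compute estimate for a hypothetical LLM.
# Rule of thumb (an assumption): ~6 FLOPs per parameter per training token.
params = 70e9   # 70 billion parameters (illustrative)
tokens = 2e12   # 2 trillion training tokens (illustrative)

total_flops = 6 * params * tokens
print(f"~{total_flops:.2e} FLOPs")  # on the order of 10^24 operations

Shaving even a fraction off the energy cost of each of those operations, as optical hardware promises, would translate into large absolute savings.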


However, optical computers are still far from competing with electronic chips outside the laboratory. These systems generally run outdated network designs and small workloads. Nevertheless, if large optical systems can be built, they could eventually make certain artificial intelligence models up to 1,000 times more efficient than electronic systems.


If, like the scientists developing optical computers, you are pushing the boundaries of artificial intelligence and large language models and creating new solutions in these fields, join the LAUNCH LLM Impact Program and be part of this innovative journey!


The application deadline is June 17.