06-05, 09:15–09:55 (Europe/Stockholm), Main stage
In this presentation, we aim to showcase a module that bridges the gap between the past and the future of computing. Our focus is a fully functional cartridge designed for the Commodore 64, equipped with a Large Language Model (LLM) trained on custom datasets derived from a collection of 400 books about the Commodore 64 from the 1980s. This module interfaces with the Commodore 64 as seamlessly as any hardware from the era, operating without the need for cloud connectivity and drawing its power from the machine's original power supply.
The submission revolves around the development and implementation of a compact module designed in the form of an 80s-style cartridge compatible with 8-bit computers, specifically the Commodore 64. This module is not just a nod to the past; it's a fully functional piece of modern technology that enables a vintage computer to interface with a dedicated Large Language Model (LLM). Here are the technical details of our project:
Hardware Design and Integration:
Cartridge Form Factor: The module is designed to match the physical and electrical specifications of 1980s cartridges, ensuring it slots into the Commodore 64 without any modification to the host system. The form factor was chosen for its nostalgic appeal and ease of use, allowing users to engage with the technology in a familiar way.
Power Supply: The module draws power directly from the Commodore 64's original power supply, so no additional external power source is required. This decision keeps the setup simple and true to the original hardware experience.
Development Board: At the heart of the module is one of the most powerful development boards currently available, selected because it can operate within the power and space constraints of the cartridge form factor while still providing enough computational power to run LLM inference in real time.
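Since the cartridge interface and its power budget drive most of the hardware choices above, the sketch below summarizes the well-known Commodore 64 expansion-port lines a 1980s-style cartridge can use and runs a rough power-headroom check. The signal map reflects standard C64 documentation; the current and wattage figures are placeholder assumptions for illustration, not measurements of the actual module.

    # Reference sketch: standard Commodore 64 expansion-port lines available to a
    # cartridge, plus a rough +5 V power-headroom check. The current/power figures
    # are placeholder assumptions, not measurements of the module described above.
    C64_EXPANSION_PORT = {
        "power":   ["+5V (limited current)", "GND"],
        "bus":     ["A0-A15 address lines", "D0-D7 data lines", "R/W", "phi2 clock"],
        "mapping": {
            "/EXROM + /ROML": "cartridge ROM visible at $8000-$9FFF",
            "/GAME + /ROMH":  "cartridge ROM visible at $A000-$BFFF (16K mode)",
            "/IO1":           "I/O expansion area at $DE00-$DEFF",
            "/IO2":           "I/O expansion area at $DF00-$DFFF",
        },
        "control": ["/IRQ", "/NMI", "/RESET", "/DMA", "BA"],
    }

    SUPPLY_VOLTAGE_V = 5.0
    ASSUMED_HEADROOM_A = 0.45      # placeholder: spare current on the +5 V rail
    ASSUMED_BOARD_DRAW_W = 1.8     # placeholder: development board under load

    available_w = SUPPLY_VOLTAGE_V * ASSUMED_HEADROOM_A
    print(f"Assumed budget: {available_w:.2f} W, assumed board draw: {ASSUMED_BOARD_DRAW_W:.2f} W")
    print("Within budget" if ASSUMED_BOARD_DRAW_W <= available_w else "Over budget")

    for select, region in C64_EXPANSION_PORT["mapping"].items():
        print(f"{select:>15}: {region}")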
Software and LLM Integration:
Custom Large Language Model: The LLM is specially trained on a dataset compiled from over 400 books about the Commodore 64 and computing in the 1980s. This custom training approach ensures that the model is highly specialized, capable of understanding and generating content that is relevant to the Commodore 64's hardware, software, and cultural significance.
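The abstract does not name the base model or training toolchain. As one plausible route, a fine-tuning pass over a directory of digitized book text could look like the sketch below, which uses the Hugging Face libraries; the base model name, corpus path and hyperparameters are placeholders, not the project's actual values.

    # Hypothetical sketch: fine-tuning a small causal language model on a corpus
    # of digitized Commodore 64 books. Model name, paths and hyperparameters are
    # placeholders; the abstract does not specify the actual toolchain.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    BASE_MODEL = "some-small-causal-lm"     # placeholder model identifier
    CORPUS_GLOB = "c64_books_txt/*.txt"     # placeholder path to OCR'd book text

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

    raw = load_dataset("text", data_files={"train": CORPUS_GLOB})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="c64-llm", num_train_epochs=3,
                               per_device_train_batch_size=4),
        train_dataset=tokenized["train"],
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    trainer.save_model("c64-llm")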
Model Optimization: Given the limited resources available on the development board, significant effort went into optimizing the LLM for efficiency. This includes compressing the model to fit the hardware constraints without a substantial loss in output quality and customizing the inference engine to run efficiently on the board's processor.
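The actual compression pipeline is not described in the abstract. As one illustrative technique that runs on a CPU-only board, post-training dynamic quantization of the model's linear layers to int8 might look like the sketch below; the model directory is a placeholder carried over from the previous example.

    # Illustrative compression step: post-training dynamic quantization of the
    # fine-tuned model's linear layers to int8 with PyTorch. This is one common
    # technique for CPU-only boards, not necessarily the pipeline used here.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_DIR = "c64-llm"   # placeholder: output directory of the previous sketch

    tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
    model = AutoModelForCausalLM.from_pretrained(MODEL_DIR).eval()

    quantized = torch.quantization.quantize_dynamic(
        model, {torch.nn.Linear}, dtype=torch.qint8   # int8 weights, fp32 activations
    )

    # Quick CPU smoke test, roughly how it would run on a small development board.
    inputs = tokenizer("The VIC-II chip is responsible for", return_tensors="pt")
    with torch.no_grad():
        out = quantized.generate(**inputs, max_new_tokens=30)
    print(tokenizer.decode(out[0], skip_special_tokens=True))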
Communication Protocol: The module communicates with the Commodore 64 using a protocol designed to achieve the highest throughput and lowest latency possible over the machine's 1980s-era interfaces, ensuring compatibility with the Commodore 64's architecture.
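The physical link and framing are not spelled out in the abstract. Purely for illustration, assuming a simple newline-delimited serial link at a period-appropriate baud rate, the development-board side of the request/response loop could be sketched as follows; the device path, baud rate and answer() stub are hypothetical.

    # Hypothetical development-board side of the request/response loop, assuming a
    # newline-delimited serial link at a 1980s-style baud rate. Device path, baud
    # rate and the answer() stub are placeholders, not the module's real protocol.
    import serial  # pyserial

    PORT = "/dev/ttyS0"   # placeholder serial device on the development board
    BAUD = 2400           # placeholder, period-appropriate rate

    def answer(prompt: str) -> str:
        """Stand-in for the on-board LLM inference call."""
        return f"(model reply to: {prompt})"

    with serial.Serial(PORT, BAUD, timeout=1) as link:
        while True:
            line = link.readline()                 # wait for a request from the C64
            if not line:
                continue                           # timeout; keep polling
            prompt = line.decode("ascii", errors="replace").strip()
            reply = answer(prompt)
            # A real link would also need PETSCII conversion and 40-column wrapping.
            link.write(reply.encode("ascii", errors="replace") + b"\r")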
Development Challenges:
Hardware Selection: Identifying a development board that fits the physical constraints of a cartridge while offering enough computational power was a significant challenge. The board needed to be small, energy-efficient, and capable of running complex AI models.
Software Optimization: Tailoring the LLM to run on limited hardware required innovative approaches to model compression and optimization, ensuring that the module could deliver real-time responses without compromising the user experience.
Integration with Vintage Technology: Ensuring seamless communication between modern AI technology and the decades-old architecture of the Commodore 64 presented unique challenges. This required a deep understanding of the Commodore 64's hardware and software, as well as creative engineering solutions to bridge the technological gap.
This project represents a unique fusion of old and new, demonstrating the potential of modern AI technologies to interact with and enhance vintage computing platforms. By carefully selecting and optimizing our hardware and software components, we have created a module that not only respects the legacy of the Commodore 64 but also extends its capabilities into the realm of artificial intelligence.
He has gained experience in both small companies and large corporations such as Samsung, Trustwave and Microsoft, working as an engineer, pentester and manager. He has worked in the security industry for more than 14 years, with experience in penetration testing, reverse engineering and vulnerability finding. He has spoken at multiple conferences in Poland (Confidence, WTH) and abroad (HiTB, PacSec, DefCamp, H2HC, BlueHat).
Co-creator of the AI project called ChatNMI
I am the author of the chapter 'Hashcat – A Race Against Time in the Function of Power and Resources' in the book 'Introduction to IT Security' by Sekurak, and the creator of ChatNMI, an innovative open-source, homebrew application designed for transparency and wide accessibility, integrating modern AI models. An Offensive Security expert and conference speaker, I have extensive experience in IT Security Incident Response, Threat Hunting, IT Forensics, and network/infrastructure security. Actively engaged in AI-related initiatives and discussions, I am a recognized expert with the AI Security Foundation, contributing to the development of secure and ethical practices in the deployment of artificial intelligence technologies.