October 22-23, 2024
Santa Clara, CA

The advancement of large language models (LLMs) has significantly enhanced natural language processing capabilities, enabling complex text understanding and generation tasks. This presentation focuses on optimizing the open-source LLaMA CPP project for the RISC-V P extension. By running the TinyLLaMA 1.1B model on the Andes Voyager development board using a quad-core CPU supporting the RISC-V P extension, performance results show that the model can achieve near real-time response. This work highlights the potential of RISC-V as an efficient platform for deploying advanced AI models in resource-constrained environments, contributing to the growing field of edge computing and embedded AI applications.
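
For context (not part of the session materials): the P extension adds packed-SIMD operations on subword integers, which map naturally onto the quantized integer kernels that dominate llama.cpp inference. Below is a minimal scalar sketch of such a kernel; the function name and data layout are illustrative assumptions, not the presenter's code.

    /* Illustrative sketch only: a scalar int8 dot product of the kind used
     * in quantized LLM inference. On a core with the RISC-V P extension,
     * packed-SIMD multiply-accumulate instructions can process several
     * 8-bit elements per operation; this plain-C loop is what such
     * instructions (or compiler autovectorization) would replace. */
    #include <stdint.h>

    int32_t dot_q8(const int8_t *a, const int8_t *b, int n) {
        int32_t acc = 0;
        for (int i = 0; i < n; i++) {
            acc += (int32_t)a[i] * (int32_t)b[i];
        }
        return acc;
    }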
Speakers
Yueh-Feng Lee
Manager, Andes Technology
Yueh-Feng Lee received his Ph.D. in computer science from National Chiao Tung University. He previously worked at MediaTek and the Industrial Technology Research Institute. His areas of focus include AI compilers and runtimes, hypervisor technology, and embedded systems.
Tuesday October 22, 2024 2:35pm - 2:53pm PDT
Grand Ballroom G (Level 1)
  AI / ML
