Large Language Models

It is no secret that I've been very enthusiastic about The Commercial NLP Renaissance, a.k.a. Large Language Models (yet a bit skeptical about the type of people it's attracting).

I've been building a lot of stuff with them, mainly Printloop, but also BORI.

This week is expected to be big for language models: GPT-4 is rumored to be announced, and high-performance language models are very close to landing on consumer hardware!

Language Computer

I've been looking for ways to build a "Language Computer" for my home; I want to run a service on my own hardware that I can use to build personal language tools, services, and home automation workflows.

I also want to host other AI services at home (like speech-recognition), but my focus is on language models first because I believe it's the most useful.

Turns out, I am very close to making this a reality! We'll soon be able to run a language model with performance comparable to OpenAI's Davinci GPT models on our own hardware.

I highly recommend reading Simon Willison's "Stanford Alpaca" post, which discusses the acceleration of on-device large language model development, particularly Stanford Alpaca and llama.cpp.
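As a rough sketch of what "on-device" looks like today, here's approximately how you'd build and run llama.cpp on your own machine. The model path and prompt below are placeholders: the quantized LLaMA weights are not distributed with the repository and must be obtained and converted separately.

```shell
# Clone and build llama.cpp (plain C/C++, runs on CPU, no GPU required)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run inference with a 4-bit quantized 7B model
# (the .bin path is an example; supply your own converted weights)
./main -m ./models/7B/ggml-model-q4_0.bin \
  -p "Summarize the following letter:" \
  -n 128
```

The `-m` flag points at the quantized model file, `-p` supplies the prompt, and `-n` caps the number of tokens generated. That this fits in a handful of commands on commodity hardware is exactly why a home "Language Computer" suddenly feels within reach.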

I already have so many ideas for things I will be able to do with my own language service at home:

  • Digitize, summarize, categorize, and sort my snail mail
  • Digest, summarize, and present the news to me
  • An army of mechanical historians that indefinitely browse the web, record findings about topics I am interested in, keep track of trends, and write me a report every day