admin
April 10, 2023, 9:41pm
1
LLM Models
There are now more and more high-quality open Models available to Compute Owners, e.g.
The above models are all available in Ollama (see below).
Model Providers
There are many Model Providers for the different Models, e.g.
Ollama is the default as of 2024-08-24.
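Since Ollama is the default provider, here is a minimal sketch of how a Compute Owner could query a locally running Ollama server from Python. The model tag `llama3.1` and the default port 11434 are assumptions about a standard local install.

```python
# Minimal sketch: send a prompt to a locally running Ollama server.
# Assumes a model has already been pulled, e.g. `ollama pull llama3.1`.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "llama3.1") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_ollama("In one sentence, what is Retrieval Augmented Generation?"))
```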
Model Interfaces
Must support Retrieval Augmented Generation (RAG), since traditional LLMs are difficult for non-technical Compute Owners to customise (e.g. by fine-tuning GPT-3.5).
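To make the RAG requirement concrete, below is a toy sketch of the pattern: retrieve the most relevant local documents and prepend them to the prompt, so no fine-tuning of the model is needed. The word-overlap scoring is only illustrative; a real interface would use an embedding model and a vector store.

```python
# Toy sketch of the RAG pattern: retrieval step + prompt assembly.
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Illustrative scoring: count shared words between query and document.
    q_words = set(query.lower().split())
    ranked = sorted(documents, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return ranked[:k]

def build_rag_prompt(query: str, documents: list[str]) -> str:
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Ollama serves local models over an HTTP API on port 11434.",
    "Quantization reduces the memory a model needs at inference time.",
]
print(build_rag_prompt("How does quantization affect memory use?", docs))
```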
admin
February 15, 2024, 3:05am
2
Mix and Match AI
Below are some preferred AI Model standards:
1. Model File Format
2. Parameters
5B or above
If possible, use an LLM with at least 5 billion parameters - the higher the parameter count, the better the quality, but the more resources it uses (see the memory sketch after this list).
3. Quantization
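Here is the memory sketch referred to above: a rough estimate of weight memory for a 5-billion-parameter model at different quantization levels. The numbers are approximations and ignore activations and the KV cache.

```python
# Rough arithmetic: weight memory = parameters * bits per weight / 8.
def weight_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for label, bits in [("FP16", 16), ("8-bit (Q8)", 8), ("4-bit (Q4)", 4)]:
    print(f"5B parameters at {label}: ~{weight_memory_gb(5, bits):.1f} GB of weights")
# 5B parameters at FP16: ~10.0 GB of weights
# 5B parameters at 8-bit (Q8): ~5.0 GB of weights
# 5B parameters at 4-bit (Q4): ~2.5 GB of weights
```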
admin
July 20, 2024, 7:08am
3
Translation
Collections
Models
BigTranslate
BigTranslate is from the Institute of Automation of the Chinese Academy of Sciences (CASIA).
Code:
References:
Llama 3
Llama 3 supports multiple languages:
English
Spanish
French
German
Italian
Portuguese
Dutch
Russian
Chinese
Japanese
Korean
As of 2024-07-20 it comes in 3 different sizes: 8 Billion (available), 70 Billion (available), and 400 Billion (almost there!) parameters.
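A sketch of using a locally served Llama 3 for translation between the languages above, via Ollama's chat endpoint. The model tag `llama3` and the prompt wording are assumptions, not a fixed recipe.

```python
# Sketch: ask a local Llama 3 (served by Ollama) to translate a sentence.
import json
import urllib.request

def translate(text: str, target_language: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "messages": [{
            "role": "user",
            "content": f"Translate into {target_language}. Reply with the translation only:\n{text}",
        }],
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

print(translate("Good morning, how are you?", "Spanish"))
```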
References:
admin
July 20, 2024, 7:51am
4
Open Source Models
Despite OpenAI's name, the ChatGPT it developed is not open source, but open-source Large Language Models are being developed quickly by others:
1. Llama
As of 2024-08-12 the default LLM is Llama 3.1.
Utilities intended for use with Llama models.
Data Cut-off Month: 2023-12
2. Stanford Alpaca
Code and documentation to train Stanford's Alpaca models, and generate the data.
Promising for non-commercial applications.
Interesting how they took it down after a short time online:
Can be used on less powerful hardware:
Instruct-tune LLaMA on consumer hardware
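The key trick behind instruct-tuning on consumer hardware is parameter-efficient fine-tuning: only small low-rank adapter matrices (LoRA) are trained on top of the frozen base model. A minimal sketch with the Hugging Face transformers and peft libraries is below; the model id and hyperparameters are assumptions, not the exact alpaca-lora settings.

```python
# Minimal LoRA sketch: wrap a base model so only adapter weights are trainable.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "meta-llama/Meta-Llama-3-8B"  # assumed model id; gated, requires access approval
model = AutoModelForCausalLM.from_pretrained(base, load_in_8bit=True, device_map="auto")  # needs bitsandbytes
tokenizer = AutoTokenizer.from_pretrained(base)

lora_config = LoraConfig(
    r=8,                                   # rank of the low-rank update matrices
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```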
3. FLAN UL2
Promising for commercial applications.
20 Billion parameters can be a bit heavy, but the gains may be worth it over the older and leaner FLAN-T5, which shares the same Flan instruction tuning.
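For reference, a hedged sketch of running FLAN-UL2 locally with Hugging Face transformers; the 20B weights need tens of GB of memory, and device_map="auto" (which needs the accelerate library) spreads them across whatever GPU and CPU RAM are available.

```python
# Sketch: load FLAN-UL2 and run one instruction-following query.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")
model = T5ForConditionalGeneration.from_pretrained("google/flan-ul2", device_map="auto")

inputs = tokenizer("Translate to German: How old are you?", return_tensors="pt").to(model.device)
outputs = model.generate(inputs.input_ids, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```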