(Illustration: Lac de Neuchatel, Switzerland. Image source: Ernest)
Prelude
A compilation of Llama 1 vs. Llama 2 comparisons and notes on the LLAMA 2 COMMUNITY LICENSE AGREEMENT. For more Meta AI models and products, please refer to these study notes.
Llama 1
- Blog = Introducing LLaMA: A foundational, 65-billion-parameter language model on 2023-02-24
- Paper = [2302.13971] LLaMA: Open and Efficient Foundation Language Models
- License = GPL v3 (inference code); the model weights were released separately under a non-commercial research license
Llama 2
- Blog = Meta and Microsoft Introduce the Next Generation of Llama on 2023-07-18
- Paper = [2307.09288] Llama 2: Open Foundation and Fine-Tuned Chat Models
- License = Llama 2 Community License Agreement
- After completing the Request form, you will receive an email explaining how to agree to the license and download the models.
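Once access is granted, one common way to run the models is through the Hugging Face transformers library. The snippet below is only a minimal sketch under assumptions: it assumes the gated meta-llama/Llama-2-7b-chat-hf checkpoint has been approved for your Hugging Face account and that transformers and torch are installed; the exact repository name depends on which checkpoint you requested.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; access must be granted on Hugging Face after
# accepting the Llama 2 Community License.
model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Simple generation test.
inputs = tokenizer("What is the capital of Switzerland?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```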
LLAMA 2 COMMUNITY LICENSE AGREEMENT
llama/LICENSE at 6d4c0c290aeec1fa4399694fefb864be5a153bb6 · facebookresearch/llama
I have highlighted the passages that have drawn the most discussion. Please refer to the link above for the full text of the license.
- You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under Meta’s intellectual property or other rights owned by Meta embodied in the Llama Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the Llama Materials.
- Redistribution of the Llama Materials is permitted.
- If a licensee uses the Llama Materials to build derivatives, are the end users of those derivatives outside the scope of Section 2? (Section 2 may only govern the organizations offering the products, not their end users?)
- The Llama Materials and their outputs cannot be used to improve other LLMs (other than Llama 2 and its derivative works).
- Licensees whose products or services exceed 700 million monthly active users (MAU) must request a separate license from Meta. (Section 2)
Llama 1 vs Llama 2 (Comparison)
Discussion
- Although the term “open source” is controversial and open to various interpretations [1], this move by Meta AI may help it capture an early share of the Edge AI market?
- If you are below 700 million MAU, don’t overthink it; just build and play?!
- Looking forward to more tests and comparison results. Each model or tool has its own advantages, disadvantages, and applicable scenarios.
- On the same day (2023-07-18), the AWS Blog also published Llama 2 foundation models from Meta are now available in Amazon SageMaker JumpStart | AWS Machine Learning Blog, so Meta presumably also worked closely with AWS to coordinate a simultaneous release on 2023-07-18? However, Meta’s blog post and press release are jointly credited with Microsoft, which is quite interesting. A minimal deployment sketch follows below.
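For reference, the deployment pattern shown in AWS’s JumpStart examples looks roughly like the sketch below. This is only a sketch under assumptions: the SageMaker Python SDK is installed, AWS credentials with permission to create endpoints are configured, and the model_id and EULA-acceptance mechanism follow the AWS examples at launch time and may differ by SDK version and region.

```python
from sagemaker.jumpstart.model import JumpStartModel

# Assumed JumpStart model identifier for the 7B Llama 2 base model;
# other sizes and the chat variants use different ids.
model = JumpStartModel(model_id="meta-textgeneration-llama-2-7b")

# Deploying creates (and bills for) a real-time SageMaker endpoint.
predictor = model.deploy()

# Per the AWS examples at release, each request must explicitly accept the EULA.
response = predictor.predict(
    {"inputs": "What is the capital of Switzerland?",
     "parameters": {"max_new_tokens": 64}},
    custom_attributes="accept_eula=true",
)
print(response)

# Clean up to stop paying for the endpoint.
predictor.delete_endpoint()
```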
[1] Meta launches Llama 2, an open source AI model that allows commercial applications | Ars Technica
Although Llama 2 is open source, Meta did not disclose the source of the training data used in creating the Llama 2 models, which Mozilla Senior Fellow of Trustworthy AI Abeba Birhane pointed out on Twitter. Lack of training data transparency is still a sticking point for some LLM critics because the training data that teaches these LLMs what they "know" often comes from an unauthorized scrape of the Internet with little regard for privacy or commercial impact.