Newsletter summary

News flash

High-Flyer open-sources its second-generation MoE model, DeepSeek-V2. On May 6, the quantitative private-fund giant High-Flyer announced via its official Weibo account that its new organization, DeepSeek, has officially open-sourced its second-generation MoE (Mixture of Experts) model, DeepSeek-V2. According to High-Flyer, the DeepSeek-V2 API is priced at 1 yuan per million input tokens and 2 yuan per million output tokens, significantly lower than comparable products currently on the market and only about 1% of the price of GPT-4-Turbo, underscoring its competitiveness.

Open-sourcing DeepSeek-V2 is another significant step by High-Flyer in the field of artificial intelligence. As an advanced architecture, an MoE model handles complex tasks by routing each input to a small subset of specialized expert sub-models and combining their outputs. Through its DeepSeek organization, High-Flyer continues to advance MoE technology, offering the industry a new option that pairs strong performance with a competitive price.
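To make the "combining multiple experts" idea concrete, here is a minimal sketch of top-k expert routing in plain Python. This is an illustrative toy, not DeepSeek-V2's actual implementation; the function and variable names (`moe_forward`, `gate`, `experts`) are invented for this example, and real MoE layers operate on learned neural-network weights at scale.

```python
import math

def moe_forward(x, gate, experts, k=2):
    """Toy top-k Mixture-of-Experts routing (illustrative sketch only).

    x:       input feature vector (list of floats)
    gate:    one weight vector per expert; its dot product with x is the
             router score for that expert
    experts: list of expert functions, each mapping a vector to a vector
    k:       number of experts activated per input
    """
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    scores = [dot(x, g) for g in gate]
    # Keep only the k highest-scoring experts (sparse activation).
    top = sorted(range(len(scores)), key=scores.__getitem__)[-k:]
    exps = [math.exp(scores[i]) for i in top]
    weights = [e / sum(exps) for e in exps]  # softmax over selected experts
    # Only the selected experts actually run; the rest are skipped,
    # which is what keeps MoE inference cost low relative to model size.
    outs = [experts[i](x) for i in top]
    return [sum(w * o[j] for w, o in zip(weights, outs))
            for j in range(len(outs[0]))]

# Toy usage: three "experts" that transform the input differently.
experts = [lambda v: [2.0 * t for t in v],
           lambda v: [-1.0 * t for t in v],
           lambda v: [0.5 * t for t in v]]
gate = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(moe_forward([1.0, 2.0], gate, experts, k=2))
```

With `k=2`, only the two best-scoring experts contribute, weighted by a softmax over their router scores; the third expert is never evaluated.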

This move by High-Flyer is likely to have a positive impact on technological development in artificial intelligence. Open-sourcing DeepSeek-V2 should help drive the industry's technical progress, while its competitive pricing will attract more enterprises and developers, further promoting the adoption and application of AI technology.