Japan’s Stockmark launches open-source LLM with 13 billion parameters, fewer hallucinations

Image credit: Stockmark

Tokyo-based Stockmark, a Japanese startup developing and offering sentence-analysis AI solutions for businesses, launched a proprietary large language model (LLM) with 13 billion parameters called Stockmark-13b on Friday. According to the company, it is the largest single Japanese-language LLM to date and was developed with assistance from the AWS LLM Development Support Program. Leveraging the disclosed company information and patent updates that the company has been collecting, the model can return more accurate answers to business questions than other LLMs by suppressing so-called hallucinations. In addition, the new model can generate text about four times faster than ChatGPT.

In August, Stockmark also introduced gpt-neox-japanese-1.4b, a Japanese LLM with 1.4 billion parameters based on the GPT-NeoX LLM training library. Subsequently, the company was selected by Amazon Web Services Japan for its LLM Development Support Program in September, through which it has been developing an LLM that can answer business questions with higher accuracy and speed using the cloud platform’s proprietary AI accelerator, Trainium. The company intends to build LLMs that can be used in practice for specific business use cases, such as new business development, application exploration, and technology development.

Stockmark was founded in April 2016 by four engineers, including CEO Tatsu Hayashi, who had worked for a major Japanese trading company. Having developed various services that recommend and structure publicly disclosed news updates from websites, the company currently offers Anews, a news-clipping service for businesses, as well as Astrategy, a tool that visualizes market trends and competitors’ movements through structured analysis of business news updates. The company claims that about 25% of the companies listed on the Nikkei 225 index use at least one of Stockmark’s products.

via PR Times