Meet Gemma: Google's Open Models in 2 Billion and 7 Billion Parameters, Trained on 6 Trillion Tokens


Introducing Gemma – 2B, 7B, 6 Trillion Tokens

Welcome to Gemma – Google's Open Models

Gemma is Google's new family of lightweight open models, released in 2 billion and 7 billion parameter sizes and trained on up to 6 trillion tokens of text. Built from the same research and technology behind Gemini, Gemma is set to make a big impact on the open-model landscape.

What is Gemma?

Gemma is an open-weights model family that developers can download, fine-tune, and run on their own hardware or in the cloud. Each size ships as both a pre-trained base model and an instruction-tuned variant, so you can use it for chat-style assistants or as a starting point for your own fine-tunes.

Key Features of Gemma

  • Two sizes: 2B and 7B parameter checkpoints, each available as a pre-trained base model and an instruction-tuned variant.
  • Large-scale training: the models were trained on up to 6 trillion tokens of primarily English web text, code, and mathematics.
  • Open weights: the weights are free to download under Google's Gemma terms of use, which permit responsible commercial and research use.

Get Started with Gemma Today

If you're ready to try Gemma, the weights are available today on Kaggle and Hugging Face, with integrations for frameworks such as Keras, PyTorch, and JAX. With 2 billion and 7 billion parameter checkpoints trained on up to 6 trillion tokens, Gemma puts Gemini-derived technology in the hands of anyone who wants to run a capable model themselves.
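
As a quick taste of how little code it takes, here is a minimal sketch of loading the instruction-tuned 7B checkpoint with the Hugging Face transformers library. The model id google/gemma-7b-it is the instruction-tuned variant on the Hub; you need to accept the Gemma terms there before the download will work, and the memory figure in the comment is an estimate.

```python
# Minimal sketch: run Gemma 7B instruction-tuned via Hugging Face transformers.
# Assumes `pip install transformers accelerate torch` and that you have
# accepted the Gemma terms on huggingface.co (the repository is gated).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-7b-it"  # instruction-tuned 7B checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # roughly 16 GB of GPU memory in bf16
    device_map="auto",           # place layers on available devices
)

# Gemma's chat template wraps the prompt in <start_of_turn>/<end_of_turn> tags.
messages = [{"role": "user", "content": "Explain tokenization in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The 2B checkpoint (google/gemma-2b-it) is a drop-in replacement for the model id above if GPU memory is tight.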

38 Comments
@user-yd4tl9vw1r
3 months ago

thanks!

@IdPreferNot1
3 months ago

Ollama already has it on its model page... just pick the one you want and run it on Ollama with three words.
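
For readers unfamiliar with it: the three words are something like `ollama run gemma:2b` (or gemma:7b) once Ollama is installed. If you would rather call the local model from a script, the official ollama Python client offers a chat API; a minimal sketch, assuming `pip install ollama` and that the model has already been pulled:

```python
# Minimal sketch: query a locally served Gemma through the `ollama` Python
# client. Assumes the Ollama daemon is running and `ollama pull gemma:2b`
# has completed.
import ollama

response = ollama.chat(
    model="gemma:2b",
    messages=[{"role": "user", "content": "Summarize what Gemma is in one line."}],
)
print(response["message"]["content"])
```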

@motbus3
3 months ago

This is a major, potentially fatal blow to freedom on the internet.

These companies are extracting data from people who never added license terms covering its use for ML models.

They will end up being the only ones producing content online, because your content will not generate enough revenue before they absorb it and reproduce it.

These models are not creative; all they do is automated plagiarism and idea theft, and of course the heads of those companies know it.

@user-nm2gz5ce3q
3 months ago

nice

@dataprospect
3 months ago

Don't forget the StarCoder and SantaCoder models. They are among the earliest open-source models that standardized data quality checks and pipelines, and they inspired so many new models.

@user-lv5kh8lb7f
3 months ago

Like your video 😃

@chiaracoetzee
3 months ago

FYI, you say the weights are English-only, but in my tests it was able to respond to queries in French. It's possible they were going for an English-only dataset but accidentally brought in some other-language data.

@nannan3347
3 months ago

No support for llama.cpp, LM Studio, Oobabooga, etc. It's a very Google move to release something "open source" that is siloed from the entire open source community.

@imedgheb7654
3 months ago

The Hugging Face version is here! :)

@davk
3 months ago

Gemini is getting significantly worse now. The same happened with GPT-3, which despite upgrades lost a lot of quality.

@Wanderer2035
3 months ago

It's censored, so it's not really that good.

@picklenickil
3 months ago

My guy's going total Pokémon on this.

Evolution after evolution.

@ShanyGolan
3 months ago

Tried 2B. Wow, it sucks 😅😅
I asked it for the derivative of x^3 and it couldn't do it. Lol. What??

@2beJT
3 months ago

Google: "Gemma"
Me: Gimmie
Google: NO, GEM-MA.. GEMMA!
Me: Gimmie Gimmie

@just.play1ng
3 months ago

Is this real 😂?

@user-qr4jf4tv2x
3 months ago

6T? You mean I can just plug an entire book into a single prompt?

@russelllapua4904
3 months ago

why tf did they name this Gemma?

@MrErick1160
3 months ago

Can you give some practical applications of such a model? I'm a data science student and I'm looking at how to use these models for meaningful purposes.

@micbab-vg2mu
3 months ago

Thank you for the great video :)

@ThoughtLineQuotes
3 months ago

Really cool, I thought there were 1 million tokens. Thanks for the video.