Eros Orion

@orioe33

Followers: 12 · Following: 0 · Media: 13 · Statuses: 130

Sharing interesting stuff here.

Joined July 2025
@orioe33
Eros Orion
5 hours
Which stage resonates with you the most? Share your thoughts! šŸ‘‡ #AI #MachineLearning #GenerativeAI #IBMTechnology
@orioe33
Eros Orion
5 hours
This framework might just be what you need to find the winning combination of models and use cases.
@orioe33
Eros Orion
5 hours
Remember, a multi-model approach can be beneficial. Different use cases may require different models for optimal performance.
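A toy sketch of what that multi-model approach can look like in practice: a small routing table that sends each use case to a different model. The use-case names and model labels below are invented placeholders, not recommendations.

    # Hypothetical routing table: each use case gets the model that suits it,
    # instead of forcing one model to cover everything.
    routes = {
        "code_review":    "small-code-model",
        "ticket_summary": "mid-size-general-model",
        "legal_drafting": "large-high-accuracy-model",
    }

    def pick_model(use_case: str) -> str:
        # Fall back to a general-purpose default for anything unmapped.
        return routes.get(use_case, "default-general-model")

    for uc in ["code_review", "ticket_summary", "marketing_copy"]:
        print(uc, "->", pick_model(uc))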
@orioe33
Eros Orion
5 hours
Finally, be mindful of deployment. Will you use a public cloud or go on-premise? Each option comes with its own trade-offs in terms of security and control.
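A minimal sketch of how a single hard constraint can settle the public-cloud vs on-premises call; the attributes listed here are illustrative, not an exhaustive trade-off analysis.

    # Simplified trade-off map -- illustrative attributes only.
    options = {
        "public_cloud": {"data_leaves_network": True,  "ops_effort": "low",  "control": "limited"},
        "on_premises":  {"data_leaves_network": False, "ops_effort": "high", "control": "full"},
    }

    def pick_deployment(data_must_stay_internal: bool) -> str:
        # In this toy example, one hard requirement is enough to force the decision.
        return "on_premises" if data_must_stay_internal else "public_cloud"

    choice = pick_deployment(data_must_stay_internal=True)
    print(choice, options[choice])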
@orioe33
Eros Orion
5 hours
Stage 5: Choose the model that delivers the most value. It’s all about aligning performance with your specific needs and context.
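One common way to make "most value" concrete is a weighted score across the criteria you care about. The weights and per-model scores below are pure assumptions, to be replaced with your own priorities and trial results.

    # Scores on a 0-1 scale (higher is better) and weights that sum to 1.
    weights = {"quality": 0.5, "cost": 0.3, "latency": 0.2}
    scores = {
        "small-model": {"quality": 0.72, "cost": 0.95, "latency": 0.90},
        "large-model": {"quality": 0.85, "cost": 0.40, "latency": 0.60},
    }

    def value(model_scores: dict) -> float:
        return sum(weights[k] * model_scores[k] for k in weights)

    for name, s in scores.items():
        print(f"{name}: {value(s):.3f}")
    print("pick:", max(scores, key=lambda name: value(scores[name])))

With these made-up numbers the smaller model wins on overall value, which echoes the "bigger isn't always better" point made elsewhere in the thread.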
@orioe33
Eros Orion
5 hours
Stage 4: Test the shortlisted models against your identified use case. Run some trials to see how they perform in real-world scenarios.
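A minimal sketch of such a trial harness. The two "models" below are stand-in Python functions so the example runs anywhere; in a real trial each would wrap a call to an actual model endpoint, and the scoring would reflect your own quality bar.

    def model_a(prompt: str) -> str:
        # Stand-in for a real model call.
        return "blue" if "sky" in prompt else "unsure"

    def model_b(prompt: str) -> str:
        # Stand-in for a different real model call.
        return "azure" if "sky" in prompt else "unknown"

    trials = [
        {"prompt": "Complete: the sky is", "expected_keywords": ["blue"]},
        {"prompt": "Complete: grass is",   "expected_keywords": ["green"]},
    ]

    def run_trials(model, cases):
        # Score one point per case whose output contains an expected keyword.
        hits = 0
        for case in cases:
            answer = model(case["prompt"]).lower()
            hits += any(k in answer for k in case["expected_keywords"])
        return hits / len(cases)

    for name, fn in [("model_a", model_a), ("model_b", model_b)]:
        print(name, run_trials(fn, trials))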
@orioe33
Eros Orion
5 hours
Stage 3: Evaluate each model's size, performance, costs, risks, and deployment methods. This step is crucial to understanding the landscape of your choices.
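A small sketch of laying that evaluation out as a side-by-side table; every figure below is a placeholder, not a real benchmark number.

    # Placeholder figures purely for illustration -- substitute your own measurements.
    criteria = ["params_b", "quality", "cost_per_1k_tok", "risk", "deploy"]
    models = {
        "small-model": {"params_b": 7,   "quality": 0.72, "cost_per_1k_tok": 0.0002, "risk": "low", "deploy": "cloud or on-prem"},
        "large-model": {"params_b": 175, "quality": 0.85, "cost_per_1k_tok": 0.0200, "risk": "med", "deploy": "hosted API only"},
    }

    print(f"{'model':<12}" + "".join(f"{c:>18}" for c in criteria))
    for name, row in models.items():
        print(f"{name:<12}" + "".join(f"{str(row[c]):>18}" for c in criteria))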
@orioe33
Eros Orion
5 hours
Stage 2: Compile a shortlist of available models. Are there foundation models your organization is already using? This will help narrow down your options.
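A toy sketch of the shortlisting step: filter a hypothetical catalogue against one hard requirement and surface models the organization already uses first. The names and attributes are invented placeholders.

    # Hypothetical candidate catalogue -- not real model data.
    candidates = [
        {"name": "model-a", "license": "open",       "max_context": 8_000,  "in_house": True},
        {"name": "model-b", "license": "commercial", "max_context": 32_000, "in_house": False},
        {"name": "model-c", "license": "open",       "max_context": 4_000,  "in_house": False},
    ]

    def shortlist(models, need_context, prefer_in_house=True):
        # Keep models that meet the hard requirement; sort already-adopted options first.
        ok = [m for m in models if m["max_context"] >= need_context]
        return sorted(ok, key=lambda m: not (prefer_in_house and m["in_house"]))

    for m in shortlist(candidates, need_context=8_000):
        print(m["name"], m["license"])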
@orioe33
Eros Orion
5 hours
Stage 1: Clearly articulate your use case. What do you want your AI to do? This is the foundation upon which everything else will be built.
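A minimal sketch of what "clearly articulate your use case" can look like when written down; the field names and example values are illustrative assumptions, not part of any standard template.

    from dataclasses import dataclass, field

    @dataclass
    class UseCaseSpec:
        # Capture the use case before looking at any model.
        task: str
        inputs: str
        outputs: str
        quality_bar: str
        constraints: list = field(default_factory=list)

    spec = UseCaseSpec(
        task="summarize customer support tickets",
        inputs="raw ticket text, up to ~2,000 words",
        outputs="a 3-sentence summary plus a category label",
        quality_bar="agents accept the summary unedited 80% of the time",
        constraints=["ticket data must stay in-region", "well under $0.01 per ticket"],
    )
    print(spec)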
@orioe33
Eros Orion
5 hours
Instead, let’s explore a straightforward AI model selection framework with six essential stages. Ready? Let’s break it down!
@orioe33
Eros Orion
5 hours
But here’s the kicker: bigger isn’t always better. šŸ“Š While massive models may seem appealing due to their parameter counts, they come with hefty costs—compute, complexity, and variability.
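A rough back-of-envelope view of the compute side of that cost: memory for the weights alone grows linearly with parameter count. This assumes 2 bytes per parameter (fp16/bf16) and ignores activations, KV cache, and serving overhead.

    def weight_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
        # Decimal gigabytes needed just to hold the weights:
        # params_billion * 1e9 params * bytes_per_param / 1e9 bytes per GB.
        return params_billion * bytes_per_param

    for params in (7, 70, 175):
        print(f"{params:>4}B params -> ~{weight_memory_gb(params):,.0f} GB of weights")

Under these assumptions a 175B-parameter model needs roughly 350 GB just for its weights, versus about 14 GB for a 7B-parameter one.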
@orioe33
Eros Orion
5 hours
If you’re diving into generative AI, choosing the right foundation model isn’t easy. With countless options available, the stakes are high—pick wrong, and you may face biases or hallucinations.
@orioe33
Eros Orion
5 hours
🧵 An essential guide to selecting the right AI foundation model for your use case:
[Image attached]
@orioe33
Eros Orion
2 days
What are your thoughts on the impact of Large Language Models? šŸ¤” Drop your comments below! And if you found this thread insightful, retweet and follow for more! #AI #MachineLearning #BusinessInnovation
@orioe33
Eros Orion
2 days
They can also assist in software development by generating and reviewing code. And this is just the beginning! šŸš€ As LLMs continue to evolve, we’ll uncover even more innovative uses.
@orioe33
Eros Orion
2 days
Now, let’s talk about business applications! LLMs can create intelligent chatbots for customer service, and generate content for articles, emails, and even YouTube scripts! šŸ“ˆ
@orioe33
Eros Orion
2 days
During training, LLMs learn to predict the next word in a sentence, adjusting their parameters to improve accuracy. Think: "The sky is..."—with enough iterations, they can confidently conclude: "blue!" 🌈
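A toy illustration of that objective: a simple bigram counter, far simpler than a real LLM, but built on the same "predict the next word" idea. The corpus is a made-up handful of sentences.

    from collections import Counter, defaultdict

    # Tiny made-up corpus; a real LLM trains on vastly more text.
    corpus = (
        "the sky is blue . the sky is blue . the sky is grey . "
        "the grass is green ."
    ).split()

    # Count which word follows each word.
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def predict_next(word: str) -> str:
        counts = following[word]
        return counts.most_common(1)[0][0] if counts else "?"

    print("the sky is ->", predict_next("is"))  # "blue" wins in this toy corpus

A counter like this only conditions on the single previous word; the point of the transformer (see the next tweet on how LLMs work) is that the prediction conditions on the whole surrounding context.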
@orioe33
Eros Orion
2 days
How do they work? šŸ¤– It's all about three components: data, architecture, and training. The transformer architecture enables LLMs to understand word context within sentences, making predictions more accurate.
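A minimal NumPy sketch of the attention operation at the heart of the transformer: every position computes weights over every other position, which is how word context feeds into the prediction. The random vectors stand in for learned word representations.

    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Scaled dot-product attention: the weights say how much each word attends to the others.
        weights = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
        return weights @ V, weights

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))     # 4 toy "words", each an 8-dimensional vector
    out, w = attention(x, x, x)     # self-attention: Q, K, V all come from the same sequence
    print(np.round(w, 2))           # each row sums to 1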
@orioe33
Eros Orion
2 days
LLMs are not just huge in size; they're also complex in nature. Large models like GPT-3 are pre-trained on roughly 45 terabytes of data and use 175 billion parameters! 🌐 That's a lot of information to process!
@orioe33
Eros Orion
2 days
We're talking about models that can be tens of gigabytes in size and trained on petabytes of text data! šŸ“š To put that in perspective, 1 GB can store about 178 million words—imagine the scale we're dealing with!
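A quick sanity check of those numbers, assuming plain ASCII text and an average English word of roughly five letters plus a space:

    bytes_per_word = 6                                        # rough assumption: ~5 letters + 1 space
    print(f"words per GB ~ {10**9 // bytes_per_word:,}")      # about 167 million
    print(f"words per PB ~ {10**15 // bytes_per_word:,}")     # about 167 trillion

That lands in the same ballpark as the 178-million-words-per-gigabyte figure above, and shows why petabyte-scale training data is hard to picture.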