Explore tweets tagged as #Datasets
Free Datasets to practice data analytics projects
1. Enron Email Dataset - Data Link: https://t.co/DG1aEo3RTh
2. Chatbot Intents Dataset - Data Link: https://t.co/DKZlo6J76s
3. Flickr 30k Dataset - Data Link: https://t.co/0PdyEl0yvJ
4. Parkinson Dataset - Data Link:
Kinda wild how under the radar $OPAN @Opanarchyai is right now, sitting under $1m mcap while quietly building what feels like the Hugging Face of robotics. If Hugging Face became the go-to hub for AI models and datasets, OPAN is doing the same for robotics, creating an open
AI agents are absolutely crushing it in DeFi rn. These bots are executing trades in milliseconds, analyzing massive datasets, and eliminating human error while we sleep. @Velvet_Capital DeFAI is the future... no cap. Are you bullish on AI-powered portfolio management or still
$NATIX is a great example of how Web3 can solve real-world problems. One of the biggest challenges in autonomous driving is data collection. It's incredibly difficult and expensive. Most open source datasets only have a few thousand hours of driving data.
$RCHV Archivas is a decentralized storage layer on $BNB Chain, using PoIS to create "living memory" that verifies and adds meaning to data. As BSC's first, it's AI-powered, rewarding useful items like models and datasets rather than random bytes, while burning $RCHV tokens per
Data has its own kind of gravity: the more you collect, the stronger it pulls. That's why large datasets naturally attract apps, users, and infrastructure; they create their own orbit of value and interaction. @genome_protocol studies this effect to build fairer data ecosystems
Pre-training Objectives for LLMs
- Pre-training is the foundational stage in developing Large Language Models (LLMs).
- It involves exposing the model to massive text datasets and training it to learn grammar, structure, meaning, and reasoning before it is fine-tuned for
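The tweet breaks off before listing the objectives themselves; the most common one is next-token (causal) prediction. Below is a minimal, illustrative PyTorch sketch of that objective. The tiny embedding-plus-head model, the vocabulary size, and the random token batch are stand-ins chosen for clarity, not anything from the original thread.

```python
# Illustrative sketch of the standard pre-training objective: next-token prediction.
# A toy embedding + linear head stands in for a full transformer stack.
import torch
import torch.nn as nn

vocab_size, d_model = 100, 32
embed = nn.Embedding(vocab_size, d_model)
lm_head = nn.Linear(d_model, vocab_size)

# A toy "massive text dataset": one batch of random token ids, shape (batch, seq_len).
tokens = torch.randint(0, vocab_size, (4, 16))
inputs, targets = tokens[:, :-1], tokens[:, 1:]   # targets are the inputs shifted by one

hidden = embed(inputs)          # stand-in for the transformer's hidden states
logits = lm_head(hidden)        # (batch, seq_len - 1, vocab)

# Cross-entropy between the predicted distribution and the actual next token.
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), targets.reshape(-1)
)
loss.backward()                 # gradients flow back into the embeddings and head
print(float(loss))
```

During pre-training this loss is minimized over billions of tokens; fine-tuning later reuses the same weights on narrower data.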
Deepfake tech, but for everyone. No datasets. No setup. No patience needed. Upload one photo, then watch AI do the rest. @higgsfield_ai | #HiggsfieldFaceSwap
Thinking like a Data Analyst didn't come easily for me. I'd spend hours staring at datasets, unsure what questions to ask. I knew I had to change if I wanted to have a successful career in data. Luckily, I found a way through. Here's what I did in 4 steps:
#Laravel lazy() vs get() Did you know? You can stream large datasets from the DB using lazy(), which is far more memory-efficient than get().
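The tip is about Laravel's Eloquent methods: get() hydrates the whole result set in memory, lazy() streams models off a cursor in chunks. As a language-neutral sketch of the same idea, assuming a hypothetical events table and using only the Python standard library, fetching rows in chunks from a cursor keeps memory flat regardless of table size:

```python
# Python analog of the Laravel tip, not Laravel itself: stream rows lazily
# instead of materializing the whole result set in memory at once.
import sqlite3

def fetch_all(conn):
    # get()-style: every row is loaded into memory before processing starts.
    return conn.execute("SELECT id, payload FROM events").fetchall()

def fetch_lazy(conn, chunk_size=1000):
    # lazy()-style: yield rows as they are read from the cursor, chunk by chunk.
    cursor = conn.execute("SELECT id, payload FROM events")
    while True:
        rows = cursor.fetchmany(chunk_size)
        if not rows:
            break
        yield from rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)",
                 [("row %d" % i,) for i in range(10_000)])

# Only one chunk of rows is held in memory at any point in time.
for row_id, payload in fetch_lazy(conn):
    pass  # process each row here
```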
This step-by-step guide will explore the intricacies of analyzing complex survey data using the powerful R programming language. https://t.co/hKrDZvpIIF
#DataScience #rstats #DataScientist #StatisticalLearning #machinelearning #datasets #datavisualizations
Speaking at the 2nd Annual AI in Health African Conference in Kampala, our AI Systems Engineer, @dylan_katamba, tackled a deeply important topic: how to prevent bias and ensure transparency in healthcare AI datasets and
Datasets are cited in countless ways: acronyms, aliases, partial names. #AI can learn to recognize them all, thanks to synthetic training data that mirror real citation patterns. Here we explain how: https://t.co/NXgVukhOrW
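The linked article explains the actual approach; purely as an illustration of what "synthetic training data that mirror real citation patterns" can look like, here is a hypothetical Python sketch. It expands a made-up dataset name into acronym, alias, and partial-name variants and drops them into template sentences with labeled mention spans, the kind of example a mention recognizer could be trained on.

```python
# Hypothetical illustration only: generate synthetic "citation variants" of a
# made-up dataset name to train a dataset-mention recognizer.
import random

def variants(full_name: str, acronym: str, aliases: list) -> list:
    partial = " ".join(full_name.split()[:2])   # partial name: drop trailing words
    return [full_name, acronym, partial, *aliases]

TEMPLATES = [
    "Data were drawn from the {name}.",
    "We evaluate our method on {name} (see Section 3).",
    "Results are consistent with earlier {name} releases.",
]

def synthetic_examples(full_name, acronym, aliases, n=5):
    """Produce n synthetic sentences, each labeled with the mention span."""
    out = []
    for _ in range(n):
        name = random.choice(variants(full_name, acronym, aliases))
        text = random.choice(TEMPLATES).format(name=name)
        start = text.index(name)
        out.append({"text": text, "span": (start, start + len(name))})
    return out

# Made-up dataset name, acronym, and alias, used only to show the pattern.
for ex in synthetic_examples("Global Widget Usage Survey", "GWUS", ["GWUS microdata"]):
    print(ex)
```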
Long-term monitoring tells a story of change. For 80 years scientists have monitored #Windermere and nearby lakes in Cumbria, tracking how climate change and pollution are reshaping our freshwater ecosystems. It's one of the longest lake datasets anywhere in the world! 1/
One prompt = one model, or how Domain-Specialized Meta-Agents make it possible. In current AI systems, if you want to create an agent for a specific domain, you need to train it, spending time, data, money, and compute... Each agent = a new model, new datasets, and new resource
Decentralised Robotics I/ Building datasets for embodied AI is tough: humanoid robots need real-world human motion task data, but collecting it at scale has been limited to research lab projects or closed-source big labs. At Eidon, we started with our wearable IMU trackers.
Over three insightful sessions, the data.europa academy hosted the workshop 'Visualising Data for Impact' with Alberto Cairo, empowering participants to turn complex datasets into clear, ethical, and engaging visuals. Read more: https://t.co/rw9UTAccVO
Ritual Data Provenance: verifiable datasets for AI. Models are only as trustworthy as the data that shapes them. Today that data moves through scripts and storage buckets with little trace of origin or integrity. Ritual Data Provenance makes datasets first-class on chain
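The post does not describe Ritual's actual mechanism, so the sketch below only illustrates the general idea it gestures at: a dataset becomes verifiable once a content digest of it is recorded somewhere tamper-evident, and anyone holding the data can recompute and compare that digest. The record layout and file name are hypothetical.

```python
# Generic illustration of dataset provenance, not Ritual's protocol or API:
# hash the dataset bytes and keep a record that anyone can re-verify later.
import hashlib, json, time

def digest(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def provenance_record(path: str, source: str) -> dict:
    # Hypothetical record layout; a real system would anchor this on chain.
    return {"sha256": digest(path), "source": source, "timestamp": int(time.time())}

def verify(path: str, record: dict) -> bool:
    # Recompute the digest from the local copy and compare with the record.
    return digest(path) == record["sha256"]

if __name__ == "__main__":
    with open("dataset.csv", "w") as f:
        f.write("id,label\n1,cat\n2,dog\n")
    rec = provenance_record("dataset.csv", source="example-collector")
    print(json.dumps(rec, indent=2))
    print("verified:", verify("dataset.csv", rec))
```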