ABC (@Ubunta)

Followers: 4K · Following: 4K · Media: 351 · Statuses: 6K

Data & ML Infrastructure for Healthcare. Opinions are my neighbour's. | DhanvantriAI | HotTechStack | 📍 🇩🇪 Berlin & 🇮🇳 Kolkata

Berlin, Germany
Joined August 2009
ABC (@Ubunta) · 9 months ago
Today in Basel, a team of 24 developers, including myself, built and deployed the following stack to handle 2 TB of offline data processing and a 15 Mbps real-time stream on a bare-metal Kubernetes cluster hosted on Hetzner. The entire setup was achieved on a budget of…
7 · 12 · 165
ABC (@Ubunta) · 1 day ago
Boom! 🚀 🥳 Our OpenEMR AI Agent with the MCP Server inside DhanvantriAI earned an award: proof that AI-driven innovation is accelerating in healthcare. Built on the HotTechStack platform, our OpenEMR Agent lets healthcare professionals interact with Electronic Medical Records…
0 · 0 · 0
ABC (@Ubunta) · 2 days ago
Navigating enterprise data access is kind of like solving a new puzzle every day; if you love complexity, this is your playground (and occasional insomnia trigger). Here's the reality as I see it:
– 🔐 SSO & OAuth vs. headless tools: enterprises often standardize on browser-based…
1 · 0 · 3
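A minimal sketch of the headless side of that tension, assuming the enterprise IdP supports an OAuth 2.0 client-credentials grant; the token URL, client ID, and scope below are hypothetical placeholders, not from the thread:

```python
# Hedged sketch: one way a headless (non-browser) tool can authenticate
# against an enterprise identity provider, assuming it exposes an
# OAuth 2.0 client-credentials grant. Endpoint and client values are
# placeholders.
import requests

TOKEN_URL = "https://idp.example.com/oauth2/token"  # hypothetical IdP endpoint

def fetch_access_token(client_id: str, client_secret: str, scope: str) -> str:
    """Exchange client credentials for a bearer token, no browser involved."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": scope,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# The token is then attached to every API call:
# requests.get(api_url, headers={"Authorization": f"Bearer {token}"})
```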
ABC (@Ubunta) · 2 days ago
The main reason for using Apache Iceberg is "Because everyone is doing it" 🎯.
1 · 0 · 4
ABC (@Ubunta) · 3 days ago
After extensive testing of Redis alternatives over recent months, here are the key findings:
- Redis remains the gold standard for ease of use and reliability. However, licensing concerns with Redis Labs' shift to dual licensing prompted exploration of alternatives for…
1 · 0 · 7
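A minimal sketch of the kind of drop-in smoke test behind findings like these: RESP-compatible alternatives such as Valkey or Dragonfly can be exercised with the unmodified redis-py client (host, port, and iteration count are assumptions):

```python
# Hedged sketch: because drop-in Redis alternatives speak the Redis
# protocol (RESP), the standard redis-py client works unchanged; only
# the endpoint differs. Useful for a quick like-for-like latency check.
import time

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def round_trip_latency(n: int = 1000) -> float:
    """Average SET+GET round trip in milliseconds over n iterations."""
    start = time.perf_counter()
    for i in range(n):
        r.set(f"bench:{i}", "value")
        r.get(f"bench:{i}")
    return (time.perf_counter() - start) / n * 1000

print(f"avg round trip: {round_trip_latency():.3f} ms")
```

Pointing the same script at Redis, Valkey, and Dragonfly in turn gives a rough ease-of-migration and latency comparison without touching application code.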
ABC (@Ubunta) · 4 days ago
RustFS is a superb Apache-licensed alternative to MinIO or LocalStack. I've tested it, and it's impressively fast 🚀. RustFS is high-performance distributed object storage built in Rust, one of the most popular languages worldwide. Along with MinIO, it shares a…
1 · 3 · 17
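A minimal sketch of exercising such a store from Python, assuming RustFS exposes an S3-compatible endpoint the way MinIO does (its positioning next to MinIO suggests this); the local URL and credentials are placeholders:

```python
# Hedged sketch: talking to an S3-compatible object store via boto3 by
# overriding endpoint_url. Endpoint and credentials below are
# placeholders for a local test setup, not real RustFS defaults.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",   # hypothetical local endpoint
    aws_access_key_id="test-access-key",    # placeholder credentials
    aws_secret_access_key="test-secret-key",
)

s3.create_bucket(Bucket="demo")
s3.put_object(Bucket="demo", Key="hello.txt", Body=b"hello from rustfs test")
obj = s3.get_object(Bucket="demo", Key="hello.txt")
print(obj["Body"].read())  # b'hello from rustfs test'
```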
ABC (@Ubunta) · 6 days ago
Think of a data pipeline like a factory assembly line: it's made up of many smaller workstations (sub-functions), each handling a specific task using different tools and technologies. Just like how a car assembly line has stations for painting, welding, and installing parts, a…
0 · 1 · 10
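A minimal sketch of that assembly-line idea in Python; the stage names and fields are illustrative, not from the tweet:

```python
# Hedged sketch: a pipeline as an ordered list of small, single-purpose
# stage functions, each consuming the previous stage's output.
from functools import reduce
from typing import Callable, Iterable

Record = dict

def extract(rows: Iterable[Record]) -> list[Record]:
    """Station 1: pull raw rows into memory."""
    return list(rows)

def clean(rows: list[Record]) -> list[Record]:
    """Station 2: drop rows missing required fields."""
    return [r for r in rows if r.get("id") is not None]

def enrich(rows: list[Record]) -> list[Record]:
    """Station 3: derive a new field from existing ones."""
    return [{**r, "name_upper": str(r.get("name", "")).upper()} for r in rows]

def run_pipeline(rows, stages: list[Callable]):
    """Feed the data through each workstation in order."""
    return reduce(lambda data, stage: stage(data), stages, rows)

result = run_pipeline(
    [{"id": 1, "name": "ada"}, {"id": None, "name": "ghost"}],
    [extract, clean, enrich],
)
print(result)  # [{'id': 1, 'name': 'ada', 'name_upper': 'ADA'}]
```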
ABC (@Ubunta) · 8 days ago
Everyone talks about "democratizing data access" for non-technical users, but that's only solving half the problem. The real challenge isn't just *accessing* the data, it's knowing what to do with it once you have it. Here's what actually happens when you give business users…
1 · 0 · 1
ABC (@Ubunta) · 10 days ago
MCP Servers function as specialized software engines that enable LLMs to execute specific tasks with high precision and reliability. LLM hallucination typically occurs when models are overwhelmed with excessive context or ambiguous data points, leading to inaccurate outputs.
1 · 2 · 6
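A minimal sketch of that scoping idea using the MCP Python SDK's FastMCP helper: one narrow, typed tool hands the model a precise capability instead of a blob of ambiguous context. The tool name and its backing data are illustrative:

```python
# Hedged sketch: a narrowly scoped MCP tool. A small, typed signature
# and loud failures leave the model little room to hallucinate.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-metrics")

# Stand-in for a real metrics store.
_DAILY_ACTIVE_USERS = {"2024-06-01": 1423, "2024-06-02": 1387}

@mcp.tool()
def daily_active_users(date: str) -> int:
    """Return the exact DAU figure for a single ISO date (YYYY-MM-DD)."""
    if date not in _DAILY_ACTIVE_USERS:
        raise ValueError(f"no data for {date}")  # fail loudly, never guess
    return _DAILY_ACTIVE_USERS[date]

if __name__ == "__main__":
    mcp.run()  # serve over stdio for an MCP-capable client
```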
ABC (@Ubunta) · 12 days ago
Building is believing. I created an MCP Server for Data Warehouses, connected it to live production systems, and ran hundreds of queries to validate what I suspected: this is the future of Business Intelligence and data exploration. What I built:
– Integrated Snowflake, …
5 · 4 · 30
ABC (@Ubunta) · 15 days ago
Over the past three days, I led AI workshops across Switzerland and Germany with 30+ engineers and master's students, focusing on deploying MCP servers for GenAI data applications. While building MCP servers seemed straightforward, the realities of production deployment revealed…
0 · 1 · 12
ABC (@Ubunta) · 20 days ago
Designing an MCP server for a data warehouse isn't trivial, but the payoff is huge. Here's why I'm convinced this pattern is the future of data access:
- Natural language = democratized data. Non-technical users can now query data using plain English. No more SQL barriers, no more…
2 · 2 · 19
ABC (@Ubunta) · 22 days ago
AI coding tools like Cursor & Windsurf are game-changers for app dev, but Data Engineering? Not so much.
- When you're processing TBs of data, testing becomes more critical than code generation. The bottleneck isn't writing SQL, it's ensuring it won't crash your warehouse.
- …
2 · 2 · 12
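A minimal sketch of that test-first point: statically check generated SQL with sqlglot and refuse anything that isn't a plain SELECT before it ever touches the warehouse. `execute` below is a stand-in for whatever connector the pipeline actually uses:

```python
# Hedged sketch: gate warehouse execution behind a static SQL check.
import sqlglot
from sqlglot import exp

def validate_select(sql: str, dialect: str = "snowflake") -> exp.Expression:
    """Parse the query and reject non-SELECT statements outright."""
    tree = sqlglot.parse_one(sql, read=dialect)  # raises on unparseable SQL
    if not isinstance(tree, exp.Select):
        raise ValueError("only SELECT statements may reach the warehouse")
    return tree

def safe_run(sql: str, execute):
    """Run the query only after it passes the static check."""
    validate_select(sql)
    return execute(sql)

# Example (connector function supplied by the caller):
# safe_run("SELECT id, amount FROM orders WHERE amount > 100", execute=my_cursor_fn)
```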
ABC (@Ubunta) · 25 days ago
Building an MCP Server for Snowflake turned out to be very useful. I'll be honest: I initially doubted how useful an MCP (Model Context Protocol) Server for Snowflake would be. But the requests from non-SQL users kept coming, so I built one anyway.
- …
1 · 0 · 15
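A minimal sketch of what such a server can look like (not the author's implementation): a single read-only query tool built on FastMCP and the official Snowflake connector, with placeholder environment-variable configuration:

```python
# Hedged sketch: one read-only MCP tool wrapping Snowflake.
import os

import snowflake.connector
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("snowflake-readonly")

def _connect():
    # Placeholder configuration via environment variables.
    return snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "COMPUTE_WH"),
    )

@mcp.tool()
def run_select(sql: str) -> list[dict]:
    """Run a single SELECT against Snowflake and return rows as dicts."""
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("read-only tool: SELECT statements only")
    conn = _connect()
    try:
        cur = conn.cursor()
        cur.execute(sql)
        cols = [c[0] for c in cur.description]
        return [dict(zip(cols, row)) for row in cur.fetchall()]
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run()  # serve over stdio for an MCP-capable client
```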
ABC (@Ubunta) · 28 days ago
I recently tackled an interesting data problem that involved some complex changes, which I managed to resolve successfully. Next, I integrated Prisma with PostgreSQL; surprisingly, this went smoothly with no issues at all. But then came the real challenge: updating the CI/CD…
1 · 0 · 9
ABC (@Ubunta) · 29 days ago
First came the trusty Data Warehouse. Then someone yelled "scale!" and everything poured into a Data Lake; add ACID and voilà: Delta Lake. We bolted on a Lakehouse, waddled off to Duck Lake, and now we have Lakebase, because every body of water apparently needs branding. Just as…
0 · 1 · 9
ABC (@Ubunta) · 1 month ago
Introducing the new OpenEMR Agent from HotTechStack: real-time intelligence for your EMR/EHR 🚀
- Comprehensive histories, lab results, meds, allergies, vitals, encounters and more in a heartbeat.
- Deep drill-down analytics: surface trends, risks and care gaps with…
0 · 0 · 7
ABC (@Ubunta) · 1 month ago
🚀 In my data pipelines, I need near-perfect record-linkage accuracy while keeping LLM token spend to a minimum. My approach:
- Run Splink first: the probabilistic workhorse quickly scores millions of record pairs and nails the obvious matches. This can be dedup or…
1 · 1 · 9
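A minimal sketch of that two-stage triage. Splink's real API is not reproduced here: `splink_score` stands in for its per-pair match probability and `llm_judge` for the metered LLM call, with illustrative thresholds:

```python
# Hedged sketch: spend LLM tokens only on the ambiguous middle band.
AUTO_MATCH = 0.95    # above this, trust the probabilistic score outright
AUTO_REJECT = 0.30   # below this, discard without spending tokens

def triage(pairs, splink_score, llm_judge):
    """Return matched pairs; call the LLM only for ambiguous scores."""
    matches = []
    for a, b in pairs:
        p = splink_score(a, b)
        if p >= AUTO_MATCH:
            matches.append((a, b))   # obvious match: free
        elif p <= AUTO_REJECT:
            continue                 # obvious non-match: free
        elif llm_judge(a, b):        # ambiguous: pay for one LLM call
            matches.append((a, b))
    return matches
```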
ABC (@Ubunta) · 1 month ago
Some key points from building AI-driven data pipelines with MCP Servers:
- MCP Server documentation alone isn't sufficient. When multiple functions perform similar operations, consolidate them into a single, well-designed function. This reduces complexity, and the AI behaves better.
0 · 0 · 5
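A minimal sketch of that consolidation advice: one parameterized FastMCP tool in place of several near-duplicate lookups such as `get_user_by_id` and `get_user_by_email` (names and data are illustrative):

```python
# Hedged sketch: one well-described tool with an explicit filter
# parameter instead of several near-identical tools.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("users")

_USERS = [{"id": 1, "email": "a@example.com", "country": "DE"}]  # stand-in data

@mcp.tool()
def find_users(field: str, value: str) -> list[dict]:
    """Look up users by exactly one field: 'id', 'email', or 'country'."""
    if field not in {"id", "email", "country"}:
        raise ValueError("field must be one of: id, email, country")
    return [u for u in _USERS if str(u[field]) == value]
```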
ABC (@Ubunta) · 1 month ago
Data Engineering has seen far more hype cycles than AI ever has. Just look at the parade of buzzwords over the years: Big Data, Spark, Real-time Everything, Data Lakes, Data Mesh, Lakehouses, CDC, Open Table Formats, Massive Clusters, Single-node DuckDB, SQL Everywhere, …
2 · 3 · 31
ABC (@Ubunta) · 1 month ago
Building AI-based data applications with an MCP Server may lead to latency issues. Here are ways to address them effectively:
- Use an external cache like Dragonfly or Redis to store MCP function results between AI model calls.
- Cache based on function name and parameters with…
0 · 0 · 6
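A minimal sketch of that caching tactic: key on the function name plus a stable hash of the parameters, and store JSON in Redis or Dragonfly with a TTL. Host, port, TTL, and the sample function are assumptions for a local setup:

```python
# Hedged sketch: cache MCP function results between AI model calls.
import hashlib
import json
from functools import wraps

import redis

r = redis.Redis(host="localhost", port=6379)  # assumed local cache server

def cached(ttl_seconds: int = 300):
    """Cache a function's JSON-serializable result in Redis, keyed on
    the function name plus a stable hash of its keyword arguments."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(**params):  # keyword-only, so the cache key is stable
            digest = hashlib.sha256(
                json.dumps(params, sort_keys=True).encode()
            ).hexdigest()
            key = f"mcp:{fn.__name__}:{digest}"
            hit = r.get(key)
            if hit is not None:
                return json.loads(hit)  # served between model calls
            result = fn(**params)
            r.setex(key, ttl_seconds, json.dumps(result))
            return result
        return wrapper
    return decorator

@cached(ttl_seconds=600)
def list_tables(schema: str) -> list:
    # Stand-in for a real, slow MCP function (e.g. a warehouse call).
    return ["orders", "customers", "events"]

print(list_tables(schema="public"))  # first call computes and caches
print(list_tables(schema="public"))  # second call is a cache hit
```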