Raji Rajagopalan

@thisisrajiraj

Followers: 1K · Following: 1K · Media: 143 · Statuses: 825

VP of PM at Microsoft. Head of Product in Microsoft AI Platform. Author of “Daring to be Different”. UN/global speaker. Views are my own.

Joined September 2009
@thisisrajiraj
Raji Rajagopalan
2 years
Hello everyone who wants to strengthen your career skills & make an impact! My book #2 that helps you learn, reflect, and practice key career skills is out on Amazon. I have written it based on my lessons from over two decades in tech. Check it out! https://t.co/bbWCTtIkIA
@code
Visual Studio Code
1 month
Looking for a seamless way to use local models in your development? With the AI Toolkit extension, you can now access the power of Foundry Local models, directly in @code. Read more in the latest blog: https://t.co/GNhbSoThdZ
@thisisrajiraj
Raji Rajagopalan
3 months
🚀 Excited to share that the brand new OSS model from OpenAI is live on Foundry Local, bringing more AI power to your local device. Try it on your PC with these two steps:
• winget install Microsoft.FoundryLocal
• foundry model run gpt-oss-20b
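Once the model is pulled and running, Foundry Local serves it behind an OpenAI-compatible endpoint, so ordinary client code mostly just needs a different base URL. Here is a minimal sketch, assuming the openai Python package; the port in base_url, the placeholder API key, and the exact model name are assumptions, so check the Foundry Local CLI output on your machine for the actual local endpoint and model id.

from openai import OpenAI

# Talk to the locally served model through Foundry Local's OpenAI-compatible
# API. The port and model name below are assumptions; adjust both to match
# what your local Foundry Local service reports.
client = OpenAI(
    base_url="http://localhost:5273/v1",  # assumed local endpoint
    api_key="not-needed",                 # local service, no real key required
)

response = client.chat.completions.create(
    model="gpt-oss-20b",  # the model started by `foundry model run gpt-oss-20b`
    messages=[{"role": "user", "content": "Explain on-device inference in one sentence."}],
)
print(response.choices[0].message.content)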
@thisisrajiraj
Raji Rajagopalan
4 months
Curious about on-device AI? I joined @sethjuarez on the #AIshow to show how you can build & run AI apps locally, with fast setup, cross-platform support, and hardware acceleration.
🚀 Demos on Mac + Windows
⚙️ Azure code runs local
⚡️ CPU, GPU, NPU smart-switching
Watch:
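The "Azure code runs local" point works because the cloud endpoint and Foundry Local speak the same OpenAI-style chat API, so switching targets is a configuration change rather than a code change. A rough sketch of that pattern, assuming the openai Python package; the environment variable names, URLs, and model aliases are illustrative and not taken from the session.

import os
from openai import OpenAI

# One code path, two targets: point LLM_BASE_URL at a cloud endpoint or at the
# local Foundry Local endpoint and the rest of the app stays the same.
# All names and defaults below are hypothetical.
base_url = os.environ.get("LLM_BASE_URL", "http://localhost:5273/v1")
model = os.environ.get("LLM_MODEL", "phi-3.5-mini")
api_key = os.environ.get("LLM_API_KEY", "not-needed")  # a real key matters only for the cloud case

client = OpenAI(base_url=base_url, api_key=api_key)
reply = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Hello from on-device AI!"}],
)
print(reply.choices[0].message.content)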
@manekinekko
Wassim Chegham
5 months
Introducing our latest open source MCP-powered demo 🎊 Featuring:
✅ @llama_index and @angular
✅ MCP in Java, Python, .NET and TypeScript
✅ @Azure @OpenAI, @GitHub Models, @Docker Model Runner & Foundry Local
⭐️ Blog: https://t.co/vTg1dadygt
⭐️ Code: https://t.co/EghGczAYjZ
@massimobonanni
Massimo Bonanni
5 months
Join Us for a Technical Deep Dive and Q&A on Foundry Local - LLMs on device https://t.co/s4MXmV4dRh
@vicky_makhija
Vicky Makhija
5 months
Join Us for a Technical Deep Dive and Q&A on Foundry Local – LLMs on device
@MSAzureDev
Microsoft Azure Developers
5 months
Run AI on your device with Foundry Local!
💻 Build fast, private, offline apps with #EdgeAI
⚙️ Power it all up with ONNX Runtime for CPU, GPU, & NPU
🔒 Secure, scalable AI development
Here's what you need to know... https://t.co/8eq6AyxUJs
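The CPU/GPU/NPU point above is what ONNX Runtime's execution providers handle: you hand the runtime an ordered preference list and it uses the best provider actually present on the device. A small sketch, assuming the onnxruntime Python package; which providers show up depends on your hardware and the installed onnxruntime build, and the model path is hypothetical.

import onnxruntime as ort

# Ordered preference: NPU first, then GPU, then the always-available CPU.
preferred = [
    "QNNExecutionProvider",  # Qualcomm NPU (e.g. Copilot+ PCs), if present
    "DmlExecutionProvider",  # DirectML GPU on Windows, if present
    "CPUExecutionProvider",  # fallback that is always available
]

available = ort.get_available_providers()
print("Available providers:", available)

# Keep only providers this machine actually supports, preserving preference order.
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("model.onnx", providers=providers)  # hypothetical model file
print("Session is using:", session.get_providers())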
@thisisrajiraj
Raji Rajagopalan
5 months
And here is the entire video:
@thisisrajiraj
Raji Rajagopalan
5 months
It was a pleasure to talk about Foundry Local at #MSBuild this week. Take a look at a few moments from our session below 👇 We showed how you can use agents with an MCP server within Foundry Local. Exciting stuff.
@ankt_srkr
Ankit Sarkar
5 months
🧠 New blog post! Getting Started with Foundry Local + Semantic Kernel
I guide you through running LLMs locally and orchestrating them with Semantic Kernel, including code and tips. Ideal for devs building AI apps. 👉 https://t.co/pWJi9yhI9n #dotnet #azure
anktsrkr.github.io
How to use Semantic Kernel in AI development using Foundry Local / OpenAI
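The blog pairs Foundry Local with Semantic Kernel; the usual wiring is to point Semantic Kernel's OpenAI connector at the local endpoint instead of the cloud. A minimal sketch, assuming a recent semantic-kernel Python package plus the openai package; the endpoint URL, model alias, and prompt are assumptions, and the post itself targets .NET, so treat this only as a Python-flavored equivalent.

import asyncio
from openai import AsyncOpenAI
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

async def main() -> None:
    # Assumed Foundry Local endpoint and model alias; the local service does
    # not need a real API key.
    local_client = AsyncOpenAI(base_url="http://localhost:5273/v1", api_key="not-needed")

    kernel = Kernel()
    kernel.add_service(OpenAIChatCompletion(ai_model_id="phi-3.5-mini", async_client=local_client))

    # Run a one-off prompt through the kernel against the local model.
    result = await kernel.invoke_prompt(prompt="Summarize what Foundry Local does in one sentence.")
    print(result)

asyncio.run(main())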
@thisisrajiraj
Raji Rajagopalan
5 months
ICYMI: we launched Foundry Local yesterday at #MSBuild for fast, private, offline-capable on-device AI. Getting started is easy, and it is available to integrate into apps on both Mac and Windows today. Give it a try and give us feedback!
devblogs.microsoft.com
You’re building a next generation AI-powered app. It needs to be fast, private, and work anywhere, even without internet connectivity. This isn’t just about prototyping. You’re shipping a real app to...
@WindowsDocs
Windows Dev Docs
5 months
Want to run AI models locally on Windows? The Windows AI Foundry Local guide walks you through setting up your dev environment, running ONNX models, and building intelligent apps, all without the cloud. Start here:
learn.microsoft.com
Foundry Local is a local version of Azure AI Foundry that enables local execution of LLMs directly on your Windows device.
@HarpalJadeja11
harpaljadeja.eth (evm/acc) 🇮🇳
5 months
Foundry Local! (BIG) Run all Foundry-supported models (they have a ton of models) locally, without any subscription, on Mac and Windows! https://t.co/ePrcqOCCCF
@kaorun55
中村 薫
5 months
Azure AI Foundry Local is amazing. I got a local LLM running in seconds.
@MaxWinebach
Max Weinbach
5 months
Microsoft is also launching Foundry Local! You can download, convert, and play with models on Windows (and it's better on Copilot+ PCs). This includes NPU optimization to an ONNX format for the WindowsML runtime!
devblogs.microsoft.com
You’re building a next generation AI-powered app. It needs to be fast, private, and work anywhere, even without internet connectivity. This isn’t just about prototyping. You’re shipping a real app to...
@MSCloud
Microsoft Cloud
6 months
Agents are evolving from single-use bots to collaborative systems. The open Agent2Agent (A2A) protocol, coming soon to Azure AI Foundry and Microsoft Copilot Studio, makes it possible. Learn more: https://t.co/0Qwycj9lgC
@thisisrajiraj
Raji Rajagopalan
6 months
ICYMI: we just brought 3 new models to life in the Phi-4 reasoning family. We used distillation, RL, and high quality data in building them & they’re excellent for low-latency agentic apps. Excited about what they unlock for on-device reasoning. 👏🏽🔥 https://t.co/qS3I5YNnUi
azure.microsoft.com
Microsoft continues to add to the conversation by unveiling its newest models, Phi-4-reasoning, Phi-4-reasoning-plus, and Phi-4-mini-reasoning. Learn more.
@thisisrajiraj
Raji Rajagopalan
6 months
Big fan of @lennysan’s podcast. This episode was 🔥
@lennysan
Lenny Rachitsky
6 months
Some key takeaways:
1. https://t.co/SjtYvZgsDS’s culture of radical transparency ensures that every employee has access to real-time metrics, fostering accountability and alignment across the organization.
2. By setting audacious goals—like building 25 features in a single