SigOpt (@SigOpt)
Followers: 4K · Following: 884 · Media: 559 · Statuses: 2K
SigOpt, which offers a scalable model experimentation and optimization platform, was acquired by Intel in October 2020.
San Francisco, CA
Joined September 2014
Hello, modelers — we have an announcement to share. It’s time for SigOpt to wind down. After a good 9-year run, we are planning to shift our efforts internally.
Because of this, we will also be pausing our efforts on social media until there is more to share. Goodbye (for now) and thank you for all of your support over the years! 💙
While we will no longer offer support and updates after September 2023, modelers can continue to access the free and open source versions of SigOpt on our website until further notice:
It’s been a pleasure building this product for you, and we hope you continue to use SigOpt’s tools to optimize and accelerate your modeling projects.
What is distillation? In this short video, Meghana Ravikumar explains how distillation transfers the knowledge from a large model to a much smaller one, using BERT as an example:
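To make the idea concrete, here is a minimal sketch of a distillation loss in PyTorch, assuming you already have teacher and student logits for a batch; the temperature and weighting values below are illustrative, not the settings used in the video.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend the usual cross-entropy with a soft-target imitation term.

    T (temperature) softens both distributions so the student can learn
    from the teacher's full output distribution; alpha balances the
    hard-label loss against imitating the teacher.
    """
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1 - alpha) * soft
```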
Modelers can use SigOpt for nearly anything: #DeepLearning, #MachineLearning, or even Airplane Design. Check out our sample use cases for more examples of how to use SigOpt for your business: https://t.co/uOC13BT3lz
Implement SigOpt with just a few lines of code. Instrument your model code to track runs and model artifacts—here's how to get started: https://t.co/EQ5z1cL5Er
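For a sense of what that instrumentation can look like, here is a hedged sketch using the sigopt Python client's run-tracking calls (params, log_dataset, log_model, log_metric) as we recall them from the docs; the scikit-learn model and dataset are stand-ins for your own code.

```python
import sigopt
from sklearn.datasets import load_digits
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Register hyperparameters on the run; the defaults apply when the run
# is not being driven by a SigOpt experiment.
sigopt.params.setdefault("learning_rate", 0.1)
sigopt.params.setdefault("n_estimators", 100)

X, y = load_digits(return_X_y=True)
model = GradientBoostingClassifier(
    learning_rate=sigopt.params.learning_rate,
    n_estimators=sigopt.params.n_estimators,
)
accuracy = cross_val_score(model, X, y, cv=5).mean()

# Attach dataset, model type, and metrics to the tracked run.
sigopt.log_dataset("sklearn digits")
sigopt.log_model("GradientBoostingClassifier")
sigopt.log_metric("accuracy", accuracy)
```

As we understand the workflow, a script like this is launched with `sigopt run python train.py`, which creates the tracked run that the logging calls attach to.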
“Integrating SigOpt into our modeling platform empowers our team to more efficiently experiment, optimize, and ultimately, model at scale.” – Peter Welinder, Research Scientist @OpenAI
Learn how SigOpt helps teams accelerate their model development: https://t.co/u2sYWpTO5w
An All Constraints experiment can help modelers study which parameter regions consistently yield high-performing models. Learn how to use this advanced experimentation technique using SigOpt: https://t.co/IMaw1GcmDH
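A rough sketch of how an experiment along these lines might be defined through the Core API, assuming metrics declared with strategy="constraint" and a threshold; the parameter space, metric names, and thresholds here are placeholders, not a reference configuration.

```python
from sigopt import Connection

conn = Connection()  # assumes SIGOPT_API_TOKEN is set in the environment

# Every metric is declared as a constraint, so the search looks for
# parameter regions that satisfy all thresholds at once rather than
# optimizing a single objective.
experiment = conn.experiments().create(
    name="All Constraints search",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
        dict(name="num_layers", type="int", bounds=dict(min=2, max=8)),
    ],
    metrics=[
        dict(name="accuracy", strategy="constraint", objective="maximize", threshold=0.90),
        dict(name="latency_ms", strategy="constraint", objective="minimize", threshold=50),
    ],
    observation_budget=60,
)
```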
“We’ve integrated SigOpt’s optimization service and are now able to get better results faster and cheaper than any solution we’ve seen before.” – Matt Adereth, Managing Director, @twosigma
Learn how SigOpt can help you amplify the impact of your models: https://t.co/bgrc34MFJR
Parameters are a crucial part of every experiment, defining the domain to be searched – which is why SigOpt supports double, integer, and categorical parameter types. Learn more about SigOpt's tools to construct a domain for your specific modeling problem: https://t.co/mKnX0HHZfI
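As an illustration, a search space mixing all three parameter types might be declared like this; the names, bounds, and categories are placeholders for your own domain.

```python
# Illustrative parameter definitions covering the three supported types.
parameters = [
    dict(name="learning_rate", type="double", bounds=dict(min=1e-5, max=1e-1)),
    dict(name="batch_size", type="int", bounds=dict(min=16, max=256)),
    dict(
        name="optimizer",
        type="categorical",
        categorical_values=["adam", "sgd", "rmsprop"],
    ),
]
```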
See how SigOpt stacks up. In this short video, Associate Professor Paul Leu walks through his test comparing two popular optimization techniques using SigOpt's intelligent experimentation platform to empirically determine the best-performing algorithm:
SigOpt offers two API modules: Core Module and AI Module. Not sure which one is right for your #ML project? Check out our guide here: https://t.co/I5wlfyrUt3
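For context, the Core Module is built around an explicit suggestion/observation loop, while the AI Module instruments training code directly (as in the run-tracking sketch above). The sketch below shows the Core pattern with a toy objective standing in for real training code.

```python
from sigopt import Connection

def evaluate(assignments):
    # Stand-in objective: replace with your own training and evaluation.
    x = assignments["x"]
    return -(x - 0.3) ** 2

conn = Connection()  # assumes SIGOPT_API_TOKEN is set in the environment
experiment = conn.experiments().create(
    name="Core module loop",
    parameters=[dict(name="x", type="double", bounds=dict(min=0.0, max=1.0))],
    metrics=[dict(name="objective", objective="maximize")],
    observation_budget=20,
)

for _ in range(experiment.observation_budget):
    # Ask SigOpt for the next configuration, evaluate it, report the result.
    suggestion = conn.experiments(experiment.id).suggestions().create()
    value = evaluate(suggestion.assignments)
    conn.experiments(experiment.id).observations().create(
        suggestion=suggestion.id,
        values=[dict(name="objective", value=value)],
    )
```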
Constraint Active Search offers an alternative to working with the Pareto efficient frontier, making it an ideal approach for material sciences and production. In this video, Gustavo Malkomes shares some of SigOpt's latest research on CAS:
Did you know that you can bring your own optimizer to SigOpt? Check out our quick-start guide to using your own optimizer and storing your progress in SigOpt: https://t.co/r7k2JnbayW
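A hedged sketch of that workflow: a stand-in optimizer (random search here) proposes configurations, and each evaluation is stored as a SigOpt run; the create_run usage follows our reading of the client docs, and the helper functions are hypothetical placeholders.

```python
import random
import sigopt

def my_optimizer_suggest():
    # Hypothetical stand-in for your own optimizer (here: random search).
    return {"learning_rate": 10 ** random.uniform(-5, -1),
            "batch_size": random.choice([32, 64, 128])}

def train_and_score(config):
    # Placeholder objective; swap in your real training loop.
    return random.random()

for i in range(10):
    config = my_optimizer_suggest()
    with sigopt.create_run(name=f"byo-optimizer-{i}") as run:
        # Record the externally suggested configuration on the run.
        for name, value in config.items():
            run.params.setdefault(name, value)
        accuracy = train_and_score(config)
        run.log_metric("accuracy", accuracy)  # store the result with the run
```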
How are you using SigOpt open source? With our new open source offering, teams can run their own self-hosted servers—meaning your data doesn't leave your server. Learn more: https://t.co/Ov8NixVint