
Ho Lab
@lab_ho
Followers: 578 · Following: 1K · Media: 102 · Statuses: 456
Laboratory for the development of novel soft robotic mechanisms at JAIST (Japan Advanced Institute of Science and Technology)
Joined January 2021
📢【Paper published】Our vision-based tactile and proximity sensing technology covering the robot's whole body has been accepted to IEEE Transactions on Robotics. #Robotics #TactileSensing #EmbodiedAI #IEEE_TRO. Details here 👉 Paper: Project page:
0
1
24
RT @TohoUnivRA: We have decided to make our past KAKENHI briefing materials public. There are no special secrets; it is mostly common-sense advice. Presentation slides from the URA KAKENHI seminar "Pitfalls everyone falls into when writing KAKENHI applications, as seen through research support."
0
390
0
Developing more capable physical systems is critical to offload low-level interactions, thereby allowing AI to devote its resources to higher-order reasoning.
Today’s Humanoid Robots Look Remarkable—but There’s a Design Flaw Holding Them Back. Beyond brains, robots desperately need smarter bodies.
0
0
3
RT @NatureJapan: Professor Eijiro Miyako and colleagues' @natbme #OA paper on the anticancer effects of tumour-resident bacteria: Tumour-resident oncolytic bacteria trigger potent anticancer effects through s….
nature.com
Nature Biomedical Engineering - A tumour-derived consortium of Proteus mirabilis and Rhodopseudomonas palustris eradicated diverse tumours in immunocompromised and immunocompetent animal models via...
0
5
0
Happy to share our latest collaborative work with Prof. @loiannog on prompt detection of, and quick recovery from, collisions with an encoder-integrated #Tombo propeller drone. #EmbodiedAI_for_Drone 👉 Paper:
0
2
15
🇮🇹 Our joint research with the University of Genoa has been published in RA-L! Toward human activity recognition, we propose a distributed multi-modal sensing method that offloads sensing functions to the interface, on both the robot side and the user side. #EmbodiedAI #HRI
ieeexplore.ieee.org
Human activity recognition (HAR) is fundamental in human-robot collaboration (HRC), enabling robots to respond to and dynamically adapt to human intentions. This paper introduces a HAR system...
Our collaborative work with colleagues from Univ. of Genoa is now published in RA-L! We propose a distributed multi-modal sensing method for human activity recognition, offloading sensing tasks to an interface #EmbodiedAI in #HRI. 👉
0
1
9
Our collaborative work with colleagues from Univ. of Genoa is now published in RA-L! We propose a distributed multi-modal sensing method for human activity recognition, offloading sensing tasks to an interface #EmbodiedAI in #HRI. 👉
0
2
18
RT @jaist_matsumura: JREC-IN Portal : Assistant Professor or Senior Lecturer Position Available in Graduate School of Advanced Science and….
jrecin.jst.go.jp
Career support portal site for all researchers and research staff who are pioneering innovation
0
3
0
RT @HaozhiQ: Our code has been released: Check out how we simulate the magnetic tactile skin and transfer it to th….
github.com
Contribute to jessicayin/tactile_skin_model development by creating an account on GitHub.
0
14
0
🎉 Announcement 🎉 (@IeeeTro) Our paper "Vision-based Proximity and Tactile Sensing for Robot Arms: Design, Perception, and Control" has been accepted to T-RO! Please look forward to the published version!
Excited to announce that our paper on "Vision-based Proximity and Tactile Sensing for Robot Arms: Design, Perception, and Control" has been accepted by T-RO @IeeeTro. Stay tuned for the published version.
0
1
17
RT @loiannog: Thrilled to share that as of July 2025, I am joining @Berkeley_EECS as an Associate Professor! I am incredibly grateful to my….
0
4
0
Back to where it all began? The vacuum gripper!
Time for blog post number eight -- "Vacuum grippers versus robot hands." Here's the question: when humanoids can do everything a human can, will they take over the warehouse? No! Vacuum is awesome! Vacuum grippers are the wheels of manipulation! It's only a five-minute read.
0
0
4