RT @norihitoishida: I presented the Neural Process, a method that takes the best of both Gaussian processes and Deep Learning. There are lots of interesting variations: incorporating Attention, Conv, or GCNs, swapping the KL in the loss for a Wasserstein distance, introducing time-dependent latent variables, and so on. I'll expand the slides and upload them to @breadhouse_semi. Thanks to everyone who attended! pic.twitter.com/DocOROkC4i
posted at 08:13:26
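As a rough illustration of the Neural Process idea mentioned above, here is a minimal sketch of the conditional variant (without the latent z and the KL term the tweet refers to), assuming PyTorch; the layer sizes, names, and toy data are illustrative, not the presenter's implementation.

import torch
import torch.nn as nn

# Minimal conditional Neural Process sketch: encode context (x, y) pairs into a
# permutation-invariant representation r, then decode target x's conditioned on r.
class NeuralProcess(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=64, h_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(               # per-point context encoder
            nn.Linear(x_dim + y_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, r_dim),
        )
        self.decoder = nn.Sequential(               # target decoder conditioned on r
            nn.Linear(x_dim + r_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, 2 * y_dim),            # predictive mean and log-variance
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        r = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=1, keepdim=True)  # aggregate context
        r = r.expand(-1, x_tgt.size(1), -1)          # broadcast r to every target point
        out = self.decoder(torch.cat([x_tgt, r], dim=-1))
        mu, log_var = out.chunk(2, dim=-1)
        return mu, log_var

# Toy usage: batch of 8 functions, 10 context points, 20 target points.
model = NeuralProcess()
x_c, y_c = torch.randn(8, 10, 1), torch.randn(8, 10, 1)
x_t = torch.randn(8, 20, 1)
mu, log_var = model(x_c, y_c, x_t)

The latent variants the tweet alludes to add a stochastic z on top of r and train with a KL (or, as mentioned, a Wasserstein) regularizer.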
RT @karpathy: The transformer architecture of GPT upper bounds its ability at memorization. It cannot learn many algorithms due to the functional form of its forward pass, and spends a fixed compute per token - i.e. it can't "think for a while". Progress here critical, likely but non-trivial. pic.twitter.com/ULJdf35MJU
posted at 08:03:51
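To make the "fixed compute per token" point concrete: a standard transformer stack applies the same fixed set of layers exactly once per forward pass, whether the next token is trivial or would benefit from longer deliberation. A toy sketch, assuming PyTorch; the layer count and sizes are illustrative only.

import torch
import torch.nn as nn

# A fixed stack of transformer layers: every forward pass costs exactly
# n_layers * cost(layer), independent of how "hard" the prediction is.
d_model, n_layers = 64, 4
layers = nn.ModuleList(
    [nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
     for _ in range(n_layers)]
)

x = torch.randn(1, 16, d_model)  # a batch of 16 token embeddings
for layer in layers:             # the model cannot choose to loop longer
    x = layer(x)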
RT @matsuu: A Python library for building rich tables in a terminal (CUI) interface. Looks very promising judging from the demo video. Nice. / “GitHub - willmcgugan/rich: Rich is a Python library for rich text and beautiful formatting in the terminal.” htn.to/2cCMqTDbAU
posted at 07:57:02
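For reference, a minimal usage sketch of the rich table API from the linked repo; the table contents are just placeholders.

from rich.console import Console
from rich.table import Table

# Build a simple table and render it with colors and box drawing in the terminal.
table = Table(title="Example")
table.add_column("Library")
table.add_column("Purpose")
table.add_row("rich", "rich text and tables in the terminal")
Console().print(table)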
RT @hardmaru: I think an AI system's ability to perform tasks like causal inference and symbolic processing will eventually be learned or evolved as an emergent property that just happens to be useful for its environment, not something that is formally defined and hand-engineered by humans.
posted at 07:03:32