Example tool calls:

```
paper_search(query="transformer attention mechanism", sources=["arxiv", "semantic_scholar"])
semantic_scholar_citations(paper_id="...")
arxiv_search(query="quantum computing", max_results=10)
```

Servers are configured in `/Users/alice/worlds/l/mcp_servers.json`. Available servers: `research-hub-mcp`, `zotero-mcp`, `openalex-mcp`, `crossref-mcp`. Each entry follows the form:

```json
"server-name": {
  "command": "npx",
  "args": ["-y", "server-package-name"]
}
```

algorithms

academic-research (−) + SDF.Ch6 (+) + [balancer] (○) = 0

Trit: 0 (ERGODIC)
Home: Prof
Poly Op: ⊗
Kan Role: Adj
Color: #26D826

(-1) + (0) + (+1) ≡ 0 (mod 3)
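The balance identity `academic-research (−) + SDF.Ch6 (+) + [balancer] (○) = 0` can be checked with a short sketch. The numeric assignments (− → −1, ○ → 0, + → +1) follow the `(-1) + (0) + (+1) ≡ 0 (mod 3)` line; the dictionary below is illustrative, not part of any real API.

```python
# Sketch: verify that the three components' trit charges sum to 0 (mod 3).
# Charge assignments follow the signs in the text: (−) → -1, (○) → 0, (+) → +1.
charges = {
    "academic-research": -1,  # (−)
    "balancer": 0,            # (○)
    "SDF.Ch6": +1,            # (+)
}
total = sum(charges.values())
print(total, total % 3)  # 0 0
```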
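For concreteness, a minimal sketch of parsing a server entry of the form shown in `mcp_servers.json`. The top-level wrapper object and the placeholder package name are assumptions; only the per-server `command`/`args` shape comes from the config snippet above.

```python
import json

# Hypothetical wrapper object: the real mcp_servers.json may nest
# entries differently; only the per-server entry shape is from the text.
config_text = """
{
  "research-hub-mcp": {
    "command": "npx",
    "args": ["-y", "server-package-name"]
  }
}
"""
servers = json.loads(config_text)
for name, entry in servers.items():
    # Build the command line that would launch this MCP server.
    cmdline = [entry["command"], *entry["args"]]
    print(name, "->", " ".join(cmdline))  # research-hub-mcp -> npx -y server-package-name
```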