News

Microsoft researchers have developed — and released — a hyper-efficient AI model that can run on CPUs, including Apple's M2.
Lower memory requirements are the most obvious advantage of simplifying a model's internal weights. The BitNet b1.58 ...
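As a rough, back-of-the-envelope illustration of that memory argument (a hypothetical sketch, not a measurement of the released model), the snippet below compares the storage a 2-billion-parameter weight tensor would need at 16-bit, 8-bit, and ternary precision, where a ternary weight carries log2(3) ≈ 1.58 bits:

```python
import math

# Hypothetical illustration: memory footprint of 2 billion weights
# at different precisions (estimates only, not measured figures).
n_params = 2_000_000_000

fp16_bytes = n_params * 2                       # 16 bits per weight
int8_bytes = n_params * 1                       # 8 bits per weight
ternary_bytes = n_params * math.log2(3) / 8     # ~1.58 bits per weight

print(f"FP16:     {fp16_bytes / 1e9:.1f} GB")   # ~4.0 GB
print(f"INT8:     {int8_bytes / 1e9:.1f} GB")   # ~2.0 GB
print(f"1.58-bit: {ternary_bytes / 1e9:.2f} GB")  # ~0.40 GB
```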
Microsoft’s new BitNet b1.58 model significantly reduces memory and energy requirements while matching the capabilities of ...
Microsoft’s BitNet b1.58 2B4T model is available on Hugging Face, but it doesn’t run on GPUs and requires Microsoft’s own bitnet.cpp framework.
Bug, or migration strategy for New Outlook? We wonder. Far be it from us to suggest Microsoft is trying to force people onto ...
In a post on Microsoft’s Support blog, the company warns that typing in a recent version of classic Outlook can lead to high ...
Microsoft (MSFT) researchers claim they’ve developed the largest-scale 1-bit AI model, also known as a “bitnet,” to date. Called BitNet b1.58 ...
Microsoft Research has introduced BitNet b1.58 2B4T, a new 2-billion-parameter language model that uses only 1.58 bits per ...
Microsoft researchers claim they've developed ... Called BitNet b1.58 2B4T, it's openly available under an MIT license and can run on CPUs, including Apple's M2. Bitnets are essentially compressed ...
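For context on what "1.58 bits per weight" means in practice: BitNet-style models constrain each weight to one of three values (-1, 0, or +1), and log2(3) ≈ 1.58 bits is the information content of such a ternary value. The sketch below is a minimal, hypothetical illustration of absmean-style ternary quantization in the spirit of the BitNet b1.58 work; it is not Microsoft's released code.

```python
import numpy as np

def absmean_ternary(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Quantize a weight tensor to {-1, 0, +1} with a per-tensor scale.

    A sketch loosely following the absmean idea from the BitNet b1.58
    papers, not the model's actual implementation.
    """
    scale = float(np.mean(np.abs(w))) + 1e-8    # per-tensor scale factor
    q = np.clip(np.round(w / scale), -1, 1)     # ternary values
    return q.astype(np.int8), scale             # dequantize later as q * scale

# Toy usage: the quantized tensor holds only three distinct values,
# i.e. about log2(3) ≈ 1.58 bits of information per weight.
w = np.random.randn(4, 4).astype(np.float32)
q, scale = absmean_ternary(w)
print(q)
print("approx. reconstruction:\n", q * scale)
```

Packing those ternary values tightly, instead of storing each weight as an 8- or 16-bit number, is where the memory and bandwidth savings that make CPU inference practical would come from.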