@SwiftOnSecurity I bought my current desktop computer years ago and got one with 32 GB. Unfortunately, that is all the motherboard can hold. I wish I had at least 64 GB now, and I don't even use it for work; mostly just browsers, with many tabs and many extensions. For general screwing around like I do, I'd now say get 64 GB, on a motherboard that supports a maximum of 256 GB. Given more money, I might say buy 128 GB with room for expansion. That's assuming you aren't going for local AI inferencing.

I want to be able to do AI inferencing locally, and I'm sure that would involve a new computer. My budget is very low. I know one needs to allocate far more to the video card if it's to be used for local LLMs. Then there's the Apple M2 approach, and the Microsoft Copilot+ PCs, which as far as I can tell don't come in any desktop models and which use a Qualcomm Snapdragon processor; that suggests to me they might cause compatibility problems with apps written for x86 machines. A discussion of how best to spend for local LLM inferencing on a budget would please me.
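For budgeting RAM or VRAM for local inferencing, a common back-of-envelope rule is weights = parameters x bytes per parameter, plus some overhead for the KV cache and runtime buffers. Here is a rough sketch of that arithmetic; the 20% overhead factor is an illustrative assumption, not a vendor figure, and real usage varies with context length and runtime.

```python
# Back-of-envelope memory estimate for hosting a local LLM.
# Weights take (parameters x bytes-per-weight); the overhead factor
# (assumed 1.2 here) loosely covers KV cache and runtime buffers.

def estimate_gb(params_billions: float, bits_per_weight: float,
                overhead: float = 1.2) -> float:
    """Approximate GB of RAM/VRAM to hold a model for inference."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * bytes_per_weight * overhead

# Compare a few common sizes and quantization levels.
for params, bits in [(7, 4), (7, 16), (13, 4), (70, 4)]:
    print(f"{params}B @ {bits}-bit: ~{estimate_gb(params, bits):.1f} GB")
```

By this estimate a 7B model quantized to 4 bits needs only about 4 GB, which is why quantized small models are the usual budget answer: they fit in a modest GPU's VRAM, or in ordinary system RAM on an Apple-style unified-memory machine, while a 70B model stays out of reach without 48+ GB.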