NinjaZ@infosec.pub to Technology@lemmy.world · English · 8 hours ago
**China scientists develop flash memory 10,000× faster than current tech** (interestingengineering.com)
42 comments · cross-posted to: [email protected]
boonhet@lemm.ee · 2 hours ago

> What price point are you trying to hit?

With regards to AI? None, tbh. With this super fast storage I have other cool ideas, but I don't think I can get enough bandwidth to saturate it.
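A quick back-of-envelope supports that bandwidth point. All figures below are assumptions for illustration (typical TLC NAND page-program latency, common page size, rough usable PCIe 5.0 x16 bandwidth), not numbers from the article; only the 10,000× speedup comes from the headline.

```python
# Back-of-envelope: could a host bus even keep this memory busy?
# All constants are assumed illustrative values, not article figures.

NAND_PROGRAM_LATENCY_S = 200e-6   # assumed: typical TLC NAND page program, ~200 us
SPEEDUP = 10_000                  # headline claim
PAGE_BYTES = 16 * 1024            # assumed: common 16 KiB NAND page
PCIE5_X16_BPS = 63e9              # assumed: ~63 GB/s usable PCIe 5.0 x16

new_latency_s = NAND_PROGRAM_LATENCY_S / SPEEDUP   # ~20 ns per page write
device_bps = PAGE_BYTES / new_latency_s            # throughput of a single write unit

print(f"per-page write latency: {new_latency_s * 1e9:.0f} ns")
print(f"single-unit write throughput: {device_bps / 1e9:.0f} GB/s")
print(f"fraction of that one unit PCIe 5.0 x16 can feed: {PCIE5_X16_BPS / device_bps:.2f}")
```

Under these assumptions a single write unit would absorb hundreds of GB/s, roughly an order of magnitude more than the whole PCIe 5.0 x16 link can deliver, so the interconnect, not the cells, becomes the bottleneck.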
gravitas_deficiency@sh.itjust.works · 1 hour ago

You're willing to pay $none to have hardware ML support for local training and inference? Well, I'll just say that you're gonna get what you pay for.
bassomitron@lemmy.world · 9 minutes ago

No, I think they're saying they're not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.