• boonhet@lemm.ee · 2 hours ago

    What price point are you trying to hit?

    With regards to AI? None, tbh.

    With this super-fast storage I have other cool ideas, but I don’t think I can get enough bandwidth to saturate it.

    • gravitas_deficiency@sh.itjust.works · 1 hour ago

      You’re willing to pay $none to have hardware ML support for local training and inference?

      Well, I’ll just say that you’re gonna get what you pay for.

      • bassomitron@lemmy.world · 9 minutes ago

        No, I think they’re saying they’re not interested in ML/AI. They want this super-fast memory available for regular servers for other use cases.