• @Hawk
    8 months ago

They were running CNN inference on a mobile device? I have no clue, but that would be costly, battery-wise at least.

    • @[email protected]
      8 months ago (edited)

They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve had dedicated ML inference cores in their chips for a long time too, which helps with battery life.
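      To make the battery point concrete: the bulk of CNN inference is just multiply-accumulate loops like the 2D convolution sketched below, which is exactly the workload dedicated inference cores (NPUs) run far more efficiently than a general-purpose CPU. This is a minimal illustrative sketch in plain Python, not any vendor's actual implementation; the function and variable names are mine.

      ```python
      def conv2d(image, kernel):
          """Naive valid-mode 2D convolution (no padding, stride 1).

          Each output pixel is a sum of kh*kw multiply-accumulates --
          the operation that dedicated ML cores accelerate in hardware.
          """
          kh, kw = len(kernel), len(kernel[0])
          oh = len(image) - kh + 1
          ow = len(image[0]) - kw + 1
          out = [[0.0] * ow for _ in range(oh)]
          for i in range(oh):
              for j in range(ow):
                  acc = 0.0
                  for di in range(kh):
                      for dj in range(kw):
                          acc += image[i + di][j + dj] * kernel[di][dj]
                  out[i][j] = acc
          return out

      # Tiny example: 8x8 input, 3x3 kernel -> 6x6 output.
      image = [[1.0] * 8 for _ in range(8)]
      kernel = [[1.0] * 3 for _ in range(3)]
      result = conv2d(image, kernel)
      print(len(result), len(result[0]))  # 6 6
      ```

      A real mobile network runs millions of these multiply-accumulates per frame; doing them on fixed-function hardware instead of the CPU is what keeps the power draw reasonable.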

      • @Hawk
        8 months ago

        It couldn’t quite be a decade; a decade ago we’d only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.