

I’ve found some strange timing differences in related tasks: for instance, when live streaming to Twitch, I can’t get a stable 1080p60 stream out of the graphics card - I start dropping frames when trying to go over 40fps - but only while live streaming, and the GPU is nowhere near maxed out. I use a 2020 Mac mini with an 8th-gen i7 + a BlackMagic eGPU.

What would it mean if they got it to run on the 16-core dedicated Neural Engine in the M1, which Apple claims is over ten times faster at machine learning and AI workloads? (I’m curious about both, actually, because I’m very tempted to get an M1 just to make my streaming problems go away entirely, and it did seem very promising.) As in, in general, because the M1 is not x86 and emulating x86 is usually slow… it would probably run about the same if the GPU does all the heavy math, assuming that would all work fine and there existed emulation software that did this exact thing already… or did you?
