>>12866
>They will not, and will in fact be the worst perpetrator yet seen
Do you mean to say AI companies will do more copy-monopoly impositions?
I very much doubt that. Copy-monopolies only work for static data. AI systems aren't static: the training data sets, the model weights, and so on are all in continuous flux. There's nothing in there that is fixed enough to turn into a definable "legal object".
These generative AIs have a processing step that uses randomized noise, so even identical instances of an AI system are unlikely to produce identical results. If you try to apply copy-monopoly logic to this, you get a mad competition to copy-rape all the things. For example, Amazon would use its large array of cloud computers to generate every possible book, meaning any somewhat coherent text. Anybody who tries to author a new book gets told "F.U., already generated and copy-raped". Not just books: ALL forms of expression would become copy-monopoly infringement. Copy-monopolies would become too absurd to continue existing.
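To make the nondeterminism point concrete, here's a minimal Python sketch (my own illustration, not any real model's sampling code): two instances get the exact same probabilities out of the "model" and can still emit different tokens, because the sampling step depends on a random source, not just the weights and the input.

```python
import random

def sample_next_token(probs: dict[str, float], rng: random.Random) -> str:
    """Pick the next token at random, weighted by the model's probabilities."""
    tokens = list(probs.keys())
    weights = list(probs.values())
    return rng.choices(tokens, weights=weights, k=1)[0]

# Identical "model output" (the probabilities) fed to two identical instances:
probs = {"cat": 0.5, "dog": 0.3, "fish": 0.2}

a = sample_next_token(probs, random.Random(1))   # instance seeded one way
b = sample_next_token(probs, random.Random(99))  # instance seeded another way
print(a, b)  # may differ, even though weights and input are identical
```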
If I were trying to build an AI company that, I don't know, draws pictures for example, I wouldn't download all the images from DeviantArt and then computationally brute-force those. That works well enough, but it's not the best method. Instead, you hire a painter to paint something from a photo, and you feed the AI system the data about their graphic-tablet inputs and their gaze (which part of the photo they focus their attention on). Then you only need to apply brute-force processing to tiny chunks of an image. Instead of using tens of gigabytes of RAM you use hundreds of megabytes of cache, which is much faster and much lower on power consumption. The "chunking method" also needs smaller training data sets, but it requires you to hire people and strap them into a fancy set-up in order to make them "teach" the AI system the entire step-by-step process of painting a picture, instead of just showing it the final result.

With this optimization you can condense an "AI thing" into a reasonably priced PCIe card with relatively low power consumption and sell it as a commodity. Artists and everybody else who inputs "intelligence chunks" would become wage-workers in a factory that makes pieces of silicon that are "a skill in a can". Over time, the many dedicated chips get consolidated into a generalized chip.
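Here's a rough Python sketch of the chunking idea (everything in it is made up for illustration: the tile size, the placeholder per-chunk model). The only point is that each step's working set is a small tile that can live in cache, instead of the whole image plus a giant model sitting in RAM.

```python
import numpy as np

TILE = 64  # 64x64 tile of floats: a few hundred KB, roughly cache-sized

def process_tile(tile: np.ndarray) -> np.ndarray:
    """Stand-in for the per-chunk 'skill' model, e.g. one trained on a
    painter's tablet strokes and gaze data for that region of the image."""
    return tile * 0.5  # placeholder transform

def process_image(img: np.ndarray) -> np.ndarray:
    """Walk over the image one small tile at a time."""
    out = np.empty_like(img)
    h, w = img.shape[:2]
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            out[y:y+TILE, x:x+TILE] = process_tile(img[y:y+TILE, x:x+TILE])
    return out

canvas = np.random.rand(512, 512, 3).astype(np.float32)
result = process_image(canvas)  # whole image handled in cache-friendly chunks
```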
Stuffing AI accelerators into a big server farm that offers AI services won't scale up. We tried that scheme already: in the 80s the vision was that everybody would have a relatively dumb device that handles the interface, while all the serious compute tasks get offloaded to a mainframe supercomputer. But it turned out that the economies of scale achieved by mass-producing low-cost end-user devices just can't be matched.
The copy-monopoly thing isn't really relevant for any of this.