In theory, yes, it's nearly impossible to prevent entirely. You may have laws and licensing on your side, but it can be difficult to prove anything. However, there are also technical measures you can take to discourage certain uses. For example, a tool called Glaze (https://glaze.cs.uchicago.edu/) applies an adversarial perturbation ("style cloak") to an image that makes its style harder to mimic for many common text-to-image models. Nothing is foolproof, though, and newer models can render this obsolete.
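To give a feel for the idea behind style cloaking: the tool nudges the image's pixels, within a small per-pixel budget, so its features move toward those of a different style. This is only a toy sketch under big assumptions — Glaze uses a real image encoder and a far more sophisticated optimization; here a random linear map stands in as a hypothetical "feature extractor":

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a style feature extractor.
# (Glaze uses a real neural image encoder; this linear map is just a toy.)
W = rng.normal(size=(16, 64))

def features(x):
    return W @ x

def cloak(x, target_feat, eps=0.05, steps=20, lr=0.01):
    """Shift x's features toward target_feat while keeping every
    pixel change within +/- eps (PGD-style signed gradient steps)."""
    delta = np.zeros_like(x)
    for _ in range(steps):
        # Gradient of ||features(x + delta) - target_feat||^2 w.r.t. delta
        grad = 2 * W.T @ (features(x + delta) - target_feat)
        delta -= lr * np.sign(grad)        # step toward the decoy style
        delta = np.clip(delta, -eps, eps)  # stay inside the budget
    return x + delta

x = rng.random(64)                 # "artwork" as a flat pixel vector
target = features(rng.random(64))  # features of a decoy-style image

x_cloaked = cloak(x, target)
before = np.linalg.norm(features(x) - target)
after = np.linalg.norm(features(x_cloaked) - target)
print(after < before, np.abs(x_cloaked - x).max() <= 0.05 + 1e-9)
```

The point is that the change stays small enough to be hard for a human to notice, while the model's view of the image shifts noticeably.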
I remember that some AI image generators have filters that block generation entirely if the prompt contains forbidden text such as "Adolf". This could be used as a hack: include such a term in your art's name so it won't get picked up by AI.