06-19-2018, 09:47 AM
From a C perspective it's fine (if we assume byte* is some 8-bit datatype), but in OpenCL everything is different. What I mean is that you have to find workarounds that do the same thing you're trying to do, but are easier for the compiler to understand.
You could try to use uchar* instead in the function declaration. But it's more likely that you can't make use of 8-bit datatypes at all, since they don't exist natively on a GPU. There are only 32-bit registers, and that's it. So, for example, you can use a combination of div, mod and switch() in order to emulate what you do with the cast (roughly like the sketch below). Anyway, welcome to my world
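Something like this, untested and just to show the idea: a hypothetical helper that reads byte "idx" out of a buffer of 32-bit words, assuming the bytes are packed little-endian into each uint. The div picks the word, the mod plus switch() picks the byte inside it, which is what the byte* cast would have done for you in plain C.

Code:
// sketch only: byte-wise read emulated on 32-bit words (hypothetical names)
uchar get_byte (__global const uint *buf, const uint idx)
{
  const uint word = buf[idx / 4];   // div: which 32-bit word holds the byte

  switch (idx % 4)                  // mod + switch: which byte inside that word
  {
    case 0:  return (uchar) ((word >>  0) & 0xff);
    case 1:  return (uchar) ((word >>  8) & 0xff);
    case 2:  return (uchar) ((word >> 16) & 0xff);
    default: return (uchar) ((word >> 24) & 0xff);
  }
}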