
Surprising behaviour of enum #142

Open
p0nce opened this issue Apr 19, 2018 · 2 comments

Comments


p0nce commented Apr 19, 2018

eval 'enum A : int { init } int a() { return 0; } A b = a;'
/tmp/.rdmd-1000/eval.B5653537C403392A9A9B20B226E57659.d(18): Error: cannot implicitly convert expression a() of type int to A
Failed: ["/usr/bin/dmd", "-d", "-v", "-o-", "/tmp/.rdmd-1000/eval.B5653537C403392A9A9B20B226E57659.d", "-I/tmp/.rdmd-1000"]
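For reference, a minimal sketch (reusing the same A and a as in the eval line above) showing that the code compiles once the int-to-A conversion is made explicit:

```d
// Sketch based on the eval snippet above: the conversion works once it is explicit.
enum A : int { init }
int a() { return 0; }

void main()
{
    // A b = a();        // Error: cannot implicitly convert expression a() of type int to A
    A b = cast(A) a();   // OK: explicit cast from the base type
    int c = b;           // OK: the enum converts implicitly back to int
}
```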

reported by @WebFreaK


WebFreak001 commented Apr 19, 2018

Wrong WebFreak mention, lol. Anyway, here is a good use case:


With enums you can essentially hack in type-safe aliases that do not implicitly convert to each other. This prevents the users of your library from mixing up handles, for example OpenGL's integer handles.

// typedef types
enum GLTexture : int { init }
enum GLTexture1D : GLTexture { init = GLTexture.init }
enum GLTexture2D : GLTexture { init = GLTexture.init }
enum GLTexture3D : GLTexture { init = GLTexture.init }

// functions using the typedefs (GLsizei and GLenum come from the OpenGL bindings)
extern(C) void glGenTextures(GLsizei n, GLTexture* textures);
extern(C) void glBindTexture(GLenum target, GLTexture texture);

Here, glBindTexture can't be called with integers, but it can be called with any texture enum value:

GLTexture tex;
glGenTextures(1, &tex); // for pointers the type must match exactly: you couldn't pass an int* nor a GLTexture2D*; however, you can always cast in these cases or define multiple overloads of your function
GLTexture2D foo = methodGeneratingATexture2D(); // you could even implicitly convert to GLTexture here, but not the other way around!
glBindTexture(GL_TEXTURE_2D, foo); // works, implicitly converts
glBindTexture(GL_TEXTURE_2D, cast(GLTexture) foo); // works too!
//glBindTexture(GL_TEXTURE_2D, 5); // nope! calling with an arbitrary handle or integer is not allowed
glBindTexture(GL_TEXTURE_2D, cast(GLTexture) 5); // yes! if the developer explicitly casts, they must know what they are doing

But why not use std.typecons : Typedef?

With Typedef, the generated assembly calls member functions of the Typedef, which can do all sorts of setup for the init values. With this enum trick, the generated assembly is exactly the same as using ints everywhere, yet the compiler still checks every call, with no template bloat and no performance cost in the executable. Typedef also ensures that you can't implicitly convert in either direction (which may actually be desired in some situations), while the enum trick blocks only one direction: because of the base-type inheritance, you can still write int a = GLTexture.init; here.
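For instance, a minimal sketch of that one-directional leak (same GLTexture enum as above):

```d
enum GLTexture : int { init }

void main()
{
    int a = GLTexture.init;           // compiles: the enum converts to its base type
    // GLTexture t = a;               // Error: int does not implicitly convert back
    GLTexture t = cast(GLTexture) a;  // an explicit cast is required
}
```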

@MrSmith33

Not surprising behaviour at all. It is described here https://dlang.org/spec/enum.html#named_enums at point 17.1.5:

A named enum member can be implicitly cast to its EnumBaseType, but EnumBaseType types cannot be implicitly cast to an enum type.
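A minimal illustration of that rule, using a hypothetical Speed enum and drive function:

```d
enum Speed : int { init }

void drive(Speed s) {}

void main()
{
    Speed s;
    int raw = s;            // OK: an enum value converts to its EnumBaseType
    // drive(42);           // Error: the EnumBaseType does not convert to the enum
    drive(cast(Speed) 42);  // OK with an explicit cast
}
```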
