MoltenGL crash when launching in Unity?


  • #1289
    chenditc
    Participant

      I have a Unity plugin which uses OpenGL ES 2 for rendering in Unity. Now there is a requirement to use the Metal graphics API in order to use ARKit in iOS 11.

      I tried to install MoltenGL and force Unity to use the Metal API, but my plugin does not seem able to draw on the screen using the OpenGL API.

      I can see the following trace:

      -> applicationDidBecomeActive()
      [mgl-info] You require a MoltenGL license for the OpenGL ES 2.0 Core feature set. Reverting to evaluation mode.
      2017-07-18 14:41:59.524044+0800 ProductName[335:16587] [mgl-info] You require a MoltenGL license for the OpenGL ES 2.0 Core feature set. Reverting to evaluation mode.

      Then it crashes at

      0x10148f708 <+300>: bl 0x101479e0c ; ::GetDriverString() at ApiGLES.cpp:2462
      0x10148f70c <+304>: mov x23, x0
      0x10148f710 <+308>: bl 0x101a469a4 ; symbol stub for: strlen
      0x10148f714 <+312>: mov x2, x0

      Is there support for Unity in this plugin?

      #1290
      chenditc
      Participant

        I would like to achieve either:

        1. When Unity is initialized with the OpenGL ES 2 graphics API, but the device has a Metal-enabled chip, the underlying work happens through the Metal API.

        Or

        2. When Unity is initialized with the Metal graphics API, the customized OpenGL ES 2 rendering code still works.

        #1291
        Bill Hollings
        Keymaster

          @chenditc

          I’m afraid that we do not directly support MoltenGL integrated into a Unity plug-in.

          Be sure to review the installation instructions in the README_MoltenGL_UserGuide.md document.

          It is important that all OpenGL ES calls made by your plugin (and Unity) are redirected through the MoltenGL headers. It is also important that the UIView that is used is backed by a CAMetalLayer.
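
           For example, a minimal sketch of a UIView backed by a CAMetalLayer might look like the following (the class name is just illustrative, and it assumes your plug-in owns or can subclass the view being rendered into):

           #import <UIKit/UIKit.h>
           #import <QuartzCore/CAMetalLayer.h>

           @interface MetalBackedView : UIView
           @end

           @implementation MetalBackedView
           // Back this view with a CAMetalLayer instead of the default layer class,
           // so MoltenGL can render through Metal.
           + (Class) layerClass { return [CAMetalLayer class]; }
           @end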

          …Bill

          #1292
          chenditc
          Participant

            I think the Unity OpenGL ES calls are using the MoltenGL implementation.

            I poked around the code called by Unity. It seems Unity is trying to call glGetString to get some driver-related information and gets a null pointer back.

            The call looks something like:

            // 0x1F00 is GL_VENDOR; 0x1F01 and 0x1F02 are GL_RENDERER and GL_VERSION.
            const GLenum GL_VENDOR = 0x1F00;
            const GLubyte* stringPointer0 = glGetString(GL_VENDOR + 0);   // vendor string
            const GLubyte* stringPointer1 = glGetString(GL_VENDOR + 1);   // renderer string
            const GLubyte* stringPointer2 = glGetString(GL_VENDOR + 2);   // version string

            Are you able to pinpoint the problem using this info? What is the implementation of glGetString?

              #1294
              chenditc
              Participant

                @bill

                 When I run the app on an A6-chip iPhone 5, it works fine. I guess that’s because it just redirects the glGetString call to the native OpenGL ES library.

                 The same glGetString calls return these strings:
                Renderer: PowerVR SGX 543
                Vendor: Imagination Technologies
                Version: OpenGL ES 2.0 IMGSGX543-128

                #1295
                Bill Hollings
                Keymaster

                  @chenditc

                   The glGetString() function under MoltenGL operates as expected once the EAGLContext has been established.

                  Your code…

                  const GLenum gl_VENDOR = 0x1F00;
                  const GLubyte* stringPointer0 = glGetString(gl_VENDOR + 0);
                  const GLubyte* stringPointer1 = glGetString(gl_VENDOR + 1);
                  const GLubyte* stringPointer2 = glGetString(gl_VENDOR + 2);
                  
                  printf("Vendor + 0: %s\n", stringPointer0);
                  printf("Vendor + 1: %s\n", stringPointer1);
                  printf("Vendor + 2: %s\n", stringPointer2);

                  produces the following output…

                  Vendor + 0: The Brenwill Workshop Ltd.
                  Vendor + 1: MoltenGL
                  Vendor + 2: OpenGL ES 2.0 MoltenGL 0.18.1 (build 0)

                   However, if the EAGLContext has not been established yet, the missing context will cause all GLES function calls to crash.

                   In your plugin, is it possible for you to ensure the EAGLContext is established before the glGetString() calls are made?

                  EAGLContext* ctx = [[EAGLContext alloc] initWithAPI: kEAGLRenderingAPIOpenGLES2];
                  [EAGLContext setCurrentContext: ctx];
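
                   If your plug-in cannot control when Unity makes those calls, a defensive check along these lines may help (just a sketch; adapt it to wherever your plug-in starts up, and make sure the headers are the MoltenGL-redirected ones per the User Guide):

                   #import <OpenGLES/EAGL.h>
                   #import <OpenGLES/ES2/gl.h>

                   // Create and set a context only if none is current yet,
                   // then query the driver strings defensively.
                   if (![EAGLContext currentContext]) {
                       EAGLContext* ctx = [[EAGLContext alloc] initWithAPI: kEAGLRenderingAPIOpenGLES2];
                       [EAGLContext setCurrentContext: ctx];
                   }
                   const GLubyte* vendor = glGetString(GL_VENDOR);
                   printf("Vendor: %s\n", vendor ? (const char*)vendor : "(null)");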

                  …Bill
