
glEnable(GL_COLOR_MATERIAL)

17 comments, last by JoeJ 2 years, 2 months ago

ogldev1 said:
the z axis isn't visible and the coordinate system doesn't seem to be attached to the object however I drew it right after drawing the object.

This surely is because your mvp matrix was given only to the shader program.

So before drawing the axes, you need to load the projection and modelview matrices using the legacy glLoadMatrix functions.
Or you write a second shader for untextured models, so you can use uniforms there as well.

ogldev1 said:
When the object rotates, the axes rotate too but now they seem to be detached from the object.

That's confusing (together with the fact that the z axis isn't drawn).
Maybe you did not really rotate the model, but just the projection (camera), which you gave to GL elsewhere in the code, so it still affects the axes.

To minimize the confusion, you can limit GPU matrix use to the camera projection only, with no model transform.
To draw the axes this way, you would load the camera projection, set the modelview to identity, and transform the axis vertices yourself on the C++ side.
To prove correctness, you could first try to draw the object vertices as white points and see if they match.
(This exercise makes sense only if you're familiar with transformations. But if so, you can make sure stuff is drawn in the right space as intended.)
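That last suggestion (transforming the axis vertices yourself on the C++ side) can be sketched like this. The snippet is framework-free for illustration; with Qt, QMatrix4x4::map() does the same job.

```cpp
#include <array>

// Plain column-major 4x4 point transform (w assumed to be 1), i.e. what
// QMatrix4x4::map(QVector3D) does under the hood.
struct Vec3 { float x, y, z; };

Vec3 transformPoint(const std::array<float, 16>& m, Vec3 p) {
    // column-major storage, matching OpenGL and QMatrix4x4::constData()
    return {
        m[0] * p.x + m[4] * p.y + m[8]  * p.z + m[12],
        m[1] * p.x + m[5] * p.y + m[9]  * p.z + m[13],
        m[2] * p.x + m[6] * p.y + m[10] * p.z + m[14],
    };
}

// Usage: transform each axis endpoint by the model matrix on the CPU, then
// submit the results while the GPU modelview is identity, e.g.
//   Vec3 xTip = transformPoint(model, {1.0f, 0.0f, 0.0f});
```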


A little shortcut: just use a shader all the time. Write a new simple shader that takes a uniform vec3 color. Draw one axis, send the uniform, draw the next axis, send the uniform again. Then none of the old-style state machine stuff matters.

Just FYI: modern OpenGL core profiles (3.1 and later) don't let you use any of that fixed-function lighting/color stuff.

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

@JoeJ @dpadam450 Guys, I'm still learning how to use shaders with every rendered object. I was going to do what you both suggested, but things got messed up.

The example I was working on, and the one I shared with you, is one that already ships with Qt, but I want to use it and add some functions. In fact, I made a copy of the whole example and opened it in Qt.

However, when I wanted to recreate the same example by myself, by which I mean:

I started a new project, added the classes and a resources file in which I added the vertex and fragment shader files. I copied the exact same code in each file (.h and .cpp) from the original example and didn't change a single line.

I then ran the code, and all I kept getting was: shader program is not linked. In the previous example I didn't get this error.

I also ran it under the debugger, and a breakpoint triggers every time at a line in the QOpenGLFunctions.h file, which I didn't write myself (breakpoint shown in the image below):

Also, I checked whether I forgot some line of code or wrote something wrong, but everything seems to be correct. Still, all I get is: shader program is not linked.

What really confuses me is that when I made a copy of the original project developed by Qt, put it in another folder, then opened and ran it, there were no errors. However, when I recreated it manually, it didn't work.

I don't want to modify the copy that works, because it works and I'm afraid I'll ruin it.

The example I recreated is the one I want to experiment on, like creating other shaders.

Now I'm totally stuck; I've been looking for a solution for about 5 hours.

I added error=program.log() to see why the shader is not linked and I got this:

QOpenGLShader: Unable to open file ":/vshader.glsl"

QOpenGLShader: Unable to open file ":/fshader.glsl"

QOpenGLShader::link: Link called without any attached shader objects.

QOpenGLShader::link: Link called without any attached shader objects.

error "Link called without any attached shader objects.\n"

I don't get it; it couldn't open the files, even though the vertex and fragment shader files exist in the same folder as the project. How come it couldn't access them?
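For context: the ":/" prefix is Qt's resource-system path, not a filesystem path. A file is only reachable through ":/" if it is listed in the project's .qrc resource file, roughly like this (filenames taken from the error messages above):

```xml
<RCC>
    <qresource prefix="/">
        <file>vshader.glsl</file>
        <file>fshader.glsl</file>
    </qresource>
</RCC>
```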

ogldev1 said:
QOpenGLShader: Unable to open file

Well, it looks like your exe simply did not find the shader files.
I'm not sure if “:/” is a valid path. If so, it probably means the path where the exe is.
Visual Studio has settings for where your debug/release exe is generated/executed, and all those paths can differ. It looks like you use some other IDE, but it's probably the same.
So yes, directory paths are another annoying source of constant confusion and issues.
The easy (but not the greatest) way is to hard code your paths. E.g. you have “C:\dev\myProject\shaders”, and you make sure you use it everywhere you load a shader, so it does not matter where the exe is.
It's also a good idea to add asserts to every file loading function, so that if files are not found, you notice quickly.
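A sketch of such an assert, assuming plain filesystem loading (the helper name is made up for illustration):

```cpp
#include <cassert>
#include <fstream>
#include <sstream>
#include <string>

// Read a whole text file, failing loudly if it cannot be opened, instead of
// letting an empty shader source slip through to the link step.
std::string loadTextFileChecked(const std::string& path) {
    std::ifstream in(path);
    assert(in.is_open() && "shader file not found -- check the working directory");
    std::ostringstream contents;
    contents << in.rdbuf();
    return contents.str();
}
```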

JoeJ said:
To draw the axis this way, you would load the camera projection, set modelview to identity, and transform the axis vertices yourself on C++ side.

Okay, so as a first attempt I tried to load the projection matrix, set the modelview to identity, and draw the coordinate system. However, the axes don't show up, and it seems something is missing in the projection matrix.

In fact here is what I've done:

this is my resizeGL where I've set up the parameters of the projection matrix:

void MainWidget::resizeGL(int w, int h)
{
    qDebug() << __func__;

    // Calculate aspect ratio
    qreal aspect = qreal(w) / qreal(h ? h : 1);

    // Set near plane to 3.0, far plane to 7.0, field of view 45 degrees
    const qreal zNear = 3.0, zFar = 7.0, fov = 45.0;

    // Reset projection
    projection.setToIdentity();

    // Set perspective projection
    projection.perspective(fov, aspect, zNear, zFar);
}

And this is paintGL, where I drew the object, then released the shader program, loaded the projection, set the modelview to identity, and drew the coordinate system by calling draw_frame():

void MainWidget::paintGL()
{
    qDebug() << __func__;

    // Clear color and depth buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    program.bind();
    texture->bind();

    //! [6]
    // Calculate model view transformation
    QMatrix4x4 matrix;
    matrix.translate(0.0, 0.0, -5.0);
    matrix.scale(0.5, 0.5, 0.5);

    program.setUniformValue("mvp_matrix", projection * matrix);
    program.setUniformValue("texture", 0);

    // Draw cube geometry
    geometries->drawCubeGeometry(&program);

    texture->release();
    program.release();

    glLoadMatrixf(projection.constData());
    // Reset projection
    projection.setToIdentity();

    glLoadMatrixf(matrix.constData());
    matrix.setToIdentity();

    draw_frame();
}

Should I, at this stage, set the projection matrix parameters again as I did in the resizeGL function? I mean specifying the fov, aspect ratio, zNear and zFar? Because I tried that right before drawing the frame, even though it didn't make sense and I knew it. The result was a disaster; the object stretched, and still the axes didn't show up.

Also guys, I wanted to thank you for your patience and help.

[And yes, when I hard-coded the paths, it did work.]

ogldev1 said:
glLoadMatrixf(matrix.constData());

It's not clear from the API which matrix mode is currently assumed.
Usually you set this like so, iirc:

glMatrixMode(GL_PROJECTION);
glLoadMatrixf(projection.constData());
glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(matrix.constData());

But it would be more future proof to follow @dpadam450's advice of using shaders for everything; I agree with that.

Please use the code block format next time, so your code is more readable ; )

@JoeJ @dpadam450 Hello again, I wanted to thank you for your precious help and patience.

Also, now I wanted to change the color of my background scene by adding a texture (an image I downloaded from Google). So I drew a quad and applied the texture to it; however, I've been facing two problems:

1 - The image doesn't cover the whole scene, so I changed the coordinates of the quad to make it cover the whole widget, but the resolution of the image isn't good; the image is very fuzzy, even though these two filters were used:

texture1->setMinificationFilter(QOpenGLTexture::Nearest);
texture1->setMagnificationFilter(QOpenGLTexture::Linear);

2 - I tried to load other images, and it seems that only part of the image covers the quad.

For example, here is the initial image (the texture I want to apply):

And this is what I get in my app:

It looks like it's taking a portion of the image.

I think it has something to do with the texture coordinates, maybe?

Here is the code used to draw the quad: QVector3D for the quad's coordinates and QVector2D for the texture coordinates:

{
    {QVector3D(-1.0f, -1.0f, 0.0f), QVector2D(0.0f,  0.0f)},
    {QVector3D( 1.0f, -1.0f, 0.0f), QVector2D(0.33f, 0.0f)},
    {QVector3D(-1.0f,  1.0f, 0.0f), QVector2D(0.0f,  0.5f)},
    {QVector3D( 1.0f,  1.0f, 0.0f), QVector2D(0.33f, 0.5f)},
}
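The guess about texture coordinates looks right: the U values stop at 0.33 and the V values at 0.5, so only about a third-by-half region of the image is sampled. To map the whole image onto the quad, the coordinates should span the full [0, 1] range; a sketch of the corrected data, with QVector3D/QVector2D replaced by plain structs so the snippet stands alone:

```cpp
struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };
struct Vertex { Vec3 position; Vec2 texCoord; };

// Texture coordinates now cover the whole image, from (0,0) to (1,1).
static const Vertex quad[4] = {
    {{-1.0f, -1.0f, 0.0f}, {0.0f, 0.0f}},
    {{ 1.0f, -1.0f, 0.0f}, {1.0f, 0.0f}},
    {{-1.0f,  1.0f, 0.0f}, {0.0f, 1.0f}},
    {{ 1.0f,  1.0f, 0.0f}, {1.0f, 1.0f}},
};
```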

Sorry for not using the code format the previous time; I used it this time.

Any help would be appreciated, thank you.

ogldev1 said:
the image doesn't cover the whole scene

You need an environment map: one or multiple images that represent the whole environment, not just a frustum like a photo does.
Most widely used is a skybox: that's 6 images, and you need to apply them in the correct order to all faces of a cube. Then you render this cube centered at the camera position, and you get the impression of an environment at infinite distance.

https://learnopengl.com/Advanced-OpenGL/Cubemaps
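The core of that technique is uploading six images to the six cube-map faces, which can be sketched as follows. The actual GL upload call is left as a comment so the snippet stands alone; 0x8515 is the value of GL_TEXTURE_CUBE_MAP_POSITIVE_X in the GL headers, and the six face targets are consecutive enum values.

```cpp
// Cube-map face targets: +X, -X, +Y, -Y, +Z, -Z, consecutive values
// starting at GL_TEXTURE_CUBE_MAP_POSITIVE_X (0x8515 in the GL headers).
const unsigned kCubeMapPositiveX = 0x8515;

inline unsigned cubeFaceTarget(int face) {
    return kCubeMapPositiveX + static_cast<unsigned>(face);
}

// Upload loop, with the GL call commented out:
//   for (int face = 0; face < 6; ++face) {
//       // glTexImage2D(cubeFaceTarget(face), 0, GL_RGBA, w, h, 0,
//       //              GL_RGBA, GL_UNSIGNED_BYTE, pixels[face]);
//   }
```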

This topic is closed to new replies.
