PeteOpenGL2Tweak - Tweaker for PeteOpenGL2 plugin w/ GTE Accuracy Hack!

Discussion in 'PSX Plugin Questions & Troubleshooting' started by tapeq, Mar 26, 2014.

  1. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @Pogo _SoM "ResHack" is meant to increase (by multiplying) the internal resolution of PSX games beyond what Pete'sOGL2 2.9 allowed. If you set the "internal X and Y resolution" in the Pete'sOGL2 GUI to "2 and 3" respectively, it would be the same as setting the Tweak .ini's "ResHack" Mul X/Y to 4 and 8, respectively (someone correct me if I'm wrong XD). So in the .ini you can set the numbers even higher, generally to match your monitor's resolution, or whatever your GPU can handle. It works as an "anti-aliasing" method, since the plugin doesn't support the common ones. This option is most useful for 3D games, though, since 2D games will only get upscaled and generally sharper, and if it's too high, it will make things slow. But if you're using xBRZ, that already "smooths and sharpens" the pixels, so setting "ResHack" to something like Mul X/Y = 4/4 would be enough for 2D, if you have an xBRZ setting of 4 as well. You can also leave "ResHack" set to "0" and use the plugin's GUI internal resolution option instead; it works the same, just not higher than what I stated.
    Pogo _SoM likes this.
  2. unreal676

    unreal676 New Member

    Messages:
    39
    Likes Received:
    12
    @TheDimensioner I thought the way ResHack works is a bit simpler in implementation. If you did 2/3, it'd just increase the resolution 2x and 3x, so 5/5 would be 5x width and 5x height. I didn't think it used the numbers 1, 2, 3, etc. the way the plugins use them.
    Pogo _SoM likes this.
  3. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @unreal676 Well, it actually multiplies the "render surface"; I guess that's the size of the textures that create the polygons, maybe, I don't know XD. Using the "Enable Console Output" option under "CPU..." will enable the console. The Tweak allows the console to show some extra information, and if "ResHack" is enabled (any number besides 0), it will show those numbers in the console, along with the "Detected render surface". Look at what the console shows when I load Bushido Blade 2:
    Captura de Tela (20).png
    "ResHack" is MulX = 6 and MulY = 5, so the "Detected render surface" is "6144x2560". I use those numbers because they work better for 1080p than "round" textures. The "internal X/Y" of the Pete'sOGL2 GUI set to 2 and 3 respectively will be, like I said, 4 and 8 for "ResHack", because it gives a "Detected render surface" of "4096x4096", so 4K textures XD. The .ini's default is "MulX/Y = 8/8", which is "8192x4096" according to the console, so it's already higher than what Pete'sOGL2 2.9's "high" settings allow. Setting "ResHack" to 8 and 16 will render 8K textures, which is probably too much for PSX games, and also the maximum "texture size" my video card (9600GT 1GB) can render XD.

    I also thought initially that the GUI's numbers were similar to ResHack's, but after looking at the console, I saw that the "high" settings for the plugin were higher than I thought. Now I feel ashamed for straining my old GeForce 6200 256MB GPU so much using those settings. I also played on a 15" 1024x768 CRT monitor, so it was stupid of me... That card had a good run though; it lasted about 3 years, and it was already used when I bought it XD.
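The arithmetic behind those console numbers can be sketched quickly. This is only an assumption inferred from the values quoted above (it is not from the Tweak's source): ResHack appears to multiply a 1024x512 base surface by MulX/MulY, and `render_surface` is a hypothetical helper written just to reproduce the reported figures.

```python
# Assumed base render surface: the smallest power-of-two surface
# covering the largest PSX video mode (640x480), per the discussion below.
BASE_W, BASE_H = 1024, 512

def render_surface(mul_x, mul_y):
    """Compute the 'Detected render surface' for given ResHack multipliers."""
    return BASE_W * mul_x, BASE_H * mul_y

print(render_surface(6, 5))  # (6144, 2560) - matches the console output above
print(render_surface(4, 8))  # (4096, 4096) - the "4K textures" case
print(render_surface(8, 8))  # (8192, 4096) - the .ini default
```

Under this assumption, MulX = 8 / MulY = 16 gives 8192x8192, which is exactly the 8K "max texture size" mentioned above.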
    Pogo _SoM likes this.
  4. unreal676

    unreal676 New Member

    Messages:
    39
    Likes Received:
    12
  5. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @unreal676 I actually don't know about it myself, I'm just using whatever information the console shows. I think it's related to how graphics cards work: resolution is the number of horizontal and vertical lines drawn each frame, but there's also the VRAM part that stores the textures before the GPU can render them... I know nothing about how GPUs work, so I can't explain that in depth XD. So "ResHack" actually makes textures bigger and sharper, while keeping the monitor's resolution the same, maybe like what "DSR" does these days. All I know is that "ResHack" makes things prettier, but too much of it can kill your GPU XD.
    Last edited: Aug 29, 2016
    Pogo _SoM likes this.
  6. Pogo _SoM

    Pogo _SoM New Member

    Messages:
    3
    Likes Received:
    0
    Well, thanks for the help. I think I can safely reduce the resolution then, since I'm playing this game on a laptop and with these high numbers it's just too slow. Unfortunately I still have problems with the fire and water effects. Big slowdowns in every town or temple with fire or water.

    Oh, by the way: good to see such a nice and helpful community.
  7. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @Pogo _SoM Yeah, I had slowdowns with this game when I played on my old GPU XD. Framebuffer effects were always a killer for me, even before xBRZ. I remember that in Tales of Phantasia there's that elf forest with water and bridges, and whenever I entered it, I had to hit F7 to switch to the software plugin (I played on ePSXe 1.9.25), because in software mode the FBEs would play normally. Now, with xBRZ, I had the same problem in some games, but "FastFBE" solved that, mostly. You can try enabling "FastFBE", if you can bear with some of the textures not being xBRZ, but only Tweaks 2.3 works properly with it. In 2.4, only some FBEs will be "fast", and the problem with textures that aren't xBRZ is still there. You can also try changing the "TextureCacheSize" option in the .ini to see if the slowdown diminishes with high xBRZ, but I don't know if it will work (I leave it at "256", but if there's any difference from the default "128" in the games I play, it's barely noticeable).
  8. unreal676

    unreal676 New Member

    Messages:
    39
    Likes Received:
    12
    @TheDimensioner The way ResHack works is confusing, not the results :3. I originally thought it did something much simpler: just rendering the game at a higher internal resolution and displaying that on the screen, whatever your resolution may be.

    PS1 games are what, natively really small, right? 320x240 or something crazy like that. I wonder what 1 and 1 on the plugin would give (goes to check)... Odd. For some reason I'm not seeing the resolution anymore; it originally showed a weird resolution, 1024x512, but that literally makes no sense... So I don't know. We'll need tapeq to chime in on this again.

    It'd make more sense to me if all it did was multiply the native resolution against the reshack numbers, so at 8x12 you'd get 2560x2880 or something like that.

    Though you're right, ResHack and DSR are pretty much the same thing, as is internal-resolution scaling in any emulator and DSR or AMD's equivalent: it just forces the internal rendering resolution to be greater than your monitor's, and then it's downsampled to the screen, whatever you have the setting at.
  9. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @unreal676 Near the very beginning of this thread (damn, it's been more than a year already; my English seemed to be better back then :oops:), @tapeq had posted sort of an "explanation" of ResHack's "resolutions":

    http://ngemu.com/threads/peteopengl...gte-accuracy-hack.160319/page-17#post-2346761

    @Hyllian, the creator of xBR, also had posted his "perfect" xBR(Z) rendition play:

    http://ngemu.com/threads/peteopengl...gte-accuracy-hack.160319/page-18#post-2347673

    Using MulX = 3 and MulY = 3 and a windowed resolution of 960x720. Multiplying 320x240 (which would be the "native" resolution of most PSX games) by 3 would also give 960x720. I use ResHack Mul X/Y = 6/5 because multiplying 320x240 by those gives me 1920x1200, the closest match to 1920x1080, which is my monitor's resolution. I see no difference at all using higher numbers for 3D games, while using lower numbers will show very noticeable aliasing. So maybe "ResHack" really does increase the internal resolution, while also increasing the texture size. If I use anything higher than Mul X/Y = 8/16, I automatically get an error telling me that my GPU's VRAM is full and that there might be rendering issues and/or crashing. Like I said, 8/16 is 8192x8192, and a GPU spec tool tells me that my graphics card's "Max Texture Size" is 8192x8192, so there you go XD.

    Using "ResHack" 6/5 and xBRZ x5 at 1920x1080 resolution with deposterize does look great, almost a 1:1 look/feel for playing any game XD. But the first FBE in the corner will immediately kill the experience :p.
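The multiplier choice described above (6/5 for a 1920x1080 monitor) can be sketched as picking, per axis, the smallest multiplier that reaches at least the screen size. This is just an illustration of the arithmetic in the post; `pick_multipliers` is a hypothetical helper, and 320x240 as the "native" resolution is the assumption stated above.

```python
import math

# Assumed native resolution of most PSX 3D games, per the post above.
NATIVE_W, NATIVE_H = 320, 240

def pick_multipliers(screen_w, screen_h):
    """Smallest per-axis multipliers whose result covers the screen."""
    return (math.ceil(screen_w / NATIVE_W),
            math.ceil(screen_h / NATIVE_H))

print(pick_multipliers(1920, 1080))  # (6, 5) -> a 1920x1200 internal image
print(pick_multipliers(960, 720))    # (3, 3) -> tapeq's 960x720 window example
```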
    unreal676 likes this.
  10. tapeq

    tapeq Member

    Messages:
    531
    Likes Received:
    72
    Wrong, the max PSX resolution is 640x480 (I have not seen it in games though, but it's in the specs; some games use 512x240), so the surface must be big enough to store all the pixels, and the smallest one that covers this is 1024x512. When you set Render to Texture, then the surface is a texture :p

    List of PSX resolutions: http://problemkaputt.de/psx-spx.htm#gpuvideomemoryvram

    PS: Most modern GPUs' max texture size for OpenGL is 16384x16384 or even more.
    PS2: Also, non-power-of-2 textures/surfaces are slower and not supported on some hardware.
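The "smallest surface that covers this" logic is just rounding each dimension up to the next power of two; a minimal sketch (the helper names are mine, not from the plugin):

```python
def next_pow2(n):
    """Smallest power of two greater than or equal to n."""
    p = 1
    while p < n:
        p *= 2
    return p

def base_surface(max_w=640, max_h=480):
    """Smallest power-of-two surface covering the largest PSX video mode."""
    return next_pow2(max_w), next_pow2(max_h)

print(base_surface())  # (1024, 512), the surface size mentioned above
```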
    Last edited: Aug 30, 2016
    unreal676 likes this.
  11. unreal676

    unreal676 New Member

    Messages:
    39
    Likes Received:
    12
    Well, now I understand why you used the numbers you did! Ahahaha, I see, very interesting. So there really isn't much need to push those numbers up really high unless you play on a high-resolution monitor to begin with. Thanks!

    And yes, I know that's the max, but I didn't think any games used it either :3.
  12. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @tapeq The confusing part really was the "render surface". So what we see on screen is like a big texture, and "ResHack" simply makes that texture bigger to make it prettier on modern displays? Well, I was just making guesses anyway XD.

    And my GPU is further and further from "modern" these days XD. My GeForce 9600GT 1GB is a great GPU; some things still run great on it, but mostly at 1280x720. Emulators can run at 1920x1080, but I generally have to lower the internal res, mostly to 2x (Dolphin and PCSX2). With PSX and N64 emulators I can use much higher internal resolutions, but like I said, if I use ResHack higher than 8/16, I really do get an error related to VRAM. You can see here that my "Max Texture Size" really is 8192x8192. Uhh, I need a new GPU XD.

    http://www.geeks3d.com/20100120/nvidia-r196-21-opengl-opencl-and-cuda-details-geforce-9600-gt/
  13. tapeq

    tapeq Member

    Messages:
    531
    Likes Received:
    72
    @TheDimensioner
    To render a frame you need to do a lot of stuff. The simplest frame processing is to set all the vertex/primitive data (primitive = point, line, triangle, polygon, etc.), then process the geometry (back-face culling, clipping, etc.), next apply pixel data, i.e. textures, and after the alpha and depth tests render those pixels to a surface, then display it (I skipped some steps to simplify this). In most cases you use two surfaces, i.e. one is being rendered to while the second is displayed (double buffering).

    So simply, a surface is memory that stores pixel data (so images are also surfaces!), and a render target surface is, in most cases, a special surface that stores rendered pixels to display to the user.
    Additionally, once the render-to-texture technique was created, some games started using effects like mirrors, which are done by rendering part of the screen to a target texture (i.e. the mirror, after additional processing to make it look like a mirrored image), then rendering it back, with the "working" mirror, to the render target :)

    Ok, guys, class dismissed :p
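The double-buffering idea from the lesson above can be sketched in a few lines. This is only a toy illustration of the swap, with surfaces modeled as plain lists of pixels; the class and method names are mine.

```python
class DoubleBuffer:
    """Toy model: render into the back surface, display the front, then swap."""

    def __init__(self, size):
        self.back = [0] * size    # surface currently being rendered into
        self.front = [0] * size   # surface currently displayed

    def render(self, frame_value):
        # "Rendering": fill the back buffer with this frame's pixel value.
        for i in range(len(self.back)):
            self.back[i] = frame_value

    def swap(self):
        # Present the finished frame; the old front becomes the next back buffer.
        self.back, self.front = self.front, self.back

db = DoubleBuffer(4)
db.render(1)   # frame 1 is drawn off-screen
db.swap()      # frame 1 goes on screen
print(db.front)  # [1, 1, 1, 1]
```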
    Last edited: Aug 31, 2016
    TheDimensioner likes this.
  14. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    Uhm, thank you for the lesson, professor @tapeq XD.
  15. Hiteryan

    Hiteryan New Member

    Messages:
    7
    Likes Received:
    0
    Hi everyone!

    First off, I just want to thank everyone involved, especially tapeq, for all the amazing work. I've been off the PSX emulation scene for quite a few years now, so I'm delighted to see these great improvements. I've used ePSXe for a long time, but the widescreen capabilities (which I had no idea of) just forced me to move to PCSXR.

    All is well; I've been trying some games at my native 2560x1440 resolution and they run wonderfully. But I can't seem to make the GTE accuracy hack work. It's enabled in the gpuPeteOpenGL2Tweak.ini file (I'm using that plugin), but I still get jaggies all over ;__; am I missing something?
  16. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    @Hiteryan The "GTE Accuracy Hack" included in the "main branch" of PCSX-R and ePSXe is an old implementation, causing bugs in 2D textures and also "holes" in some games. Try PCSXR-PGXP; it has many improvements over the old GTE hack, and it also includes a version of tapeq's Tweaks with modifications to work specifically with PGXP:

    http://ngemu.com/threads/pcsxr-pgxp.186369/

    About the "jaggies": if you mean aliasing (the staircase effect), raise the "ResHack" option in the .ini to something like MulX = 8, MulY = 16. There was sort of a discussion up here on this page, so you might get some info about it XD. This will increase the internal resolution beyond what even Pete'sOGL2 2.9 could achieve, and the "render surface" will be 8192x8192, 8K textures :p. If you still see aliasing, then try using a shader. There are many shaders with different effects, but I personally use the "FXAA + natural colors" shader, which removes "jaggies" on 3D models and also saturates the colors to something like a CRT, which naturally enhances color. I've tried using my video driver to add anti-aliasing, but it never worked for me, although others say it does, so you could try that as well.
  17. Hiteryan

    Hiteryan New Member

    Messages:
    7
    Likes Received:
    0
    Sorry, my bad, I didn't mean jaggies (I don't really see any noticeable aliasing, so no complaints in that area); what I meant was really just the old GTE accuracy hack. I'm still getting the wobbliness in the textures/models due to fixed-point math. I am using PCSXR-PGXP, yes, and I've enabled all three options in the PGXP config menu, plus set its mode to Memory only.
  18. zackboy

    zackboy New Member

    Messages:
    21
    Likes Received:
    0
    With an NVIDIA card we can also configure AA and FXAA for ePSXe.exe and pcsxr-pgxp.exe/pcsxr.exe.
    MulX = 8, MulY = 12 (as the recommended max) + FXAA + AA 8x makes games wonderful.
    But I think he's talking about the old GTE Accuracy hack (in ePSXe you have to activate it in gpuPeteOpenGL2Tweak.ini, and ePSXe 2.x requires the GTE hack with sub-pixel precision).
    I think the ePSXe team wants to include PGXP only once its WIP status is removed.
  19. TheDimensioner

    TheDimensioner New Member

    Messages:
    242
    Likes Received:
    60
    Well, like I said, using the driver never worked for me (NVIDIA GPU here, although a very old 9600GT XD). But raising "ResHack" to something like MulX/Y 8/16 already makes games very beautiful on my 1920x1080 monitor, although extremely slow.

    I think he just confused what the GTE hack does. It's meant to "fix" PSX wobbliness and texture warping, not "jaggies"; that's for anti-aliasing. And now that I've gotten used to PCSXR-PGXP, I don't think I really want PGXP to be implemented in ePSXe, since the direction the team is taking right now seems to be phasing out the PC version in favor of the Android version. The latest Windows versions have many bugs, and the improvements weren't that noticeable. I think an implementation in Libretro-Mednafen would be most enjoyed, since that's where most devs think the future of PSX emulation is at. It would be worth finally learning how to use RetroArch when that happens (my last attempts were catastrophic XD).
  20. zackboy

    zackboy New Member

    Messages:
    21
    Likes Received:
    0
    Oh, I'm sorry for you ^^
    Yes, maybe you're right. (I read it as jitter like an idiot.)

    I personally still prefer ePSXe for its speed compared to PCSXR, but yes, today PCSXR is the best PS1 emulator (even if I don't have any bugs in ePSXe)..
    I don't really agree with all those people who say Mednafen is really good; it's cool, yes, but for the moment just setting the internal res to 4x makes games really slow (and I have an i5 4670K + GTX 970..). Same with GPUBladeSoft (before the last two versions, which disabled this feature). About that, IMO edgbla has many good ideas for his plugin, maybe a little ambitious but good, and some problems from OGL2 are fixed (precise bilinear textures: http://forum.emu-russia.net/viewtopic.php?p=15484#p15484 ). There are also good features (an unfinished z-buffer: http://forum.emu-russia.net/viewtopic.php?p=17449#p17449 and perspective correction without bugs in Spyro: http://forum.emu-russia.net/viewtopic.php?p=23684#p23684).

    By the way, do you know why some textures aren't xBRZified and stay nearest-neighbor even with FastFBE disabled?
    Last edited: Sep 3, 2016
