#opengl wrapper
Writeup: The Great(?) OpenGL Wrapper Race
Somehow, I always find myself gravitating back to the S3 ViRGE/DX.
This time around, rather than placing total focus on the ViRGE itself, we'll be taking a look at some OpenGL wrappers!
This writeup will be updated along the way as more videos release.
The setup is as follows:
Matsonic MS7102C
Windows 98SE RTM with the following patches/updates: KernelEX 4.5.2, NUSB v3.3e, Windows Installer 2.0, DirectX 9.0c
Intel Pentium III (Coppermine) @ 750MHz
S3 ViRGE/DX @50MHz w/4MB EDO DRAM (using the S3 Virge "SuperUni" drivers)
256MB PC-133 SDRAM (Kingston KVR133X64C3/512) (System can only handle 256MB per DIMM)
Sound Blaster AWE32 CT2760
Some random 5GB Hitachi 2.5" HDD that I "borrowed" from a very dead laptop
Java 6 (1.6.0) build 105/Java 6u0
Wrappers to be tested:
S3Mesa - a wrapper based on the Mesa project. It's a full OpenGL implementation sitting on top of S3D and Direct3D, but from the available source code appears to be missing some functionality and is quite unstable.
AltOGL - also a wrapper based on the Mesa project, but relies solely on Direct3D. It is similarly missing some functionality, but is much more widely compatible with cards beyond the Virge thanks to its lack of reliance on S3's proprietary API.
Techland S3D - one of the many wrappers made by Techland for their Quake II engine-based "Crime Cities" game. Like S3's own GLQuake wrappers, it only implements as much of the API as the engine needs, but it still implements far more features than S3's DX5- and DX6-based wrappers, which are not tested in this wrapper race.
Techland D3D - like AltOGL, Techland's D3D counterpart to their S3D wrapper implements a subset of the OpenGL spec, but still enough to be compatible with a number of titles beyond Quake II engine games.
GLDirect 1.x - A very early version of GLDirect. There exists a license code for this version floating around on the internet that appears to have been used for internal testing by Shuttle, a PC manufacturer that's largely fallen out of relevance and mainly does industrial PCs nowadays.
GLDirect 3.0.0 - One of the last versions of GLDirect to support hardware acceleration on DX6-class graphics cards.
Things tested
WGLGears
ClassiCube 1.3.5 and 1.3.6
Minecraft: Indev in-20091223-1459, Alpha 1.0.4, Beta 1.7.3 (with and without Optifine), Release 1.5.2
Tux Racer
GL Excess benchmark
Half Life 1 v1.1.1.1 KingSoft NoSteam edition
Findings
GLDirect 1.01
OpenGL Version String
Vendor: SciTech Software, Inc. Renderer: GLDirect S3 Inc. ViRGE/DX/GX Version: 1.2 Mesa 3.1
Textures do not work at all in any test case besides Half Life.
ClassiCube 1.3.5 and 1.3.6 both fail to render any terrain beyond a greyish horizon and the outlines of blocks. All blocks, items, and text that are able to be rendered are pure white in color and have no textures applied.
Minecraft Indev in-20091223-1459 and Beta 1.7.3 with Optifine crash upon world-load
Minecraft Alpha 1.0.4, Beta 1.7.3, and Release 1.5.2 all crash upon launch.
Tux Racer is able to render 2D textures such as text and graphics, but flickers INTENSELY and is a seizure risk. Beyond this, however, the game will only render a solid white screen.
Half Life launches and runs, but at a terrible 4 FPS. The game lags hard enough that the tram in the opening area of the game is frozen where it is, preventing the player from accessing anything beyond the intro cutscene.
GL Excess crashes instantly.
Performance
GLGears: ~76 FPS
ClassiCube 1.3.5/1.3.6: Unknown; game fails to render
Minecraft in-20091223-1459: Unknown; world crash
Minecraft Alpha 1.0.4: Unknown; crash on game launch
Minecraft Beta 1.7.3: Unknown; crash on game launch
Minecraft Beta 1.7.3 w/Optifine: Unknown; world crash
Minecraft Release 1.5.2: Unknown; crash on game launch
Tux Racer: Unknown; game fails to render in a VERY seizure-inducing way
Half Life: ~4 FPS; gameplay outside of training room is broken
GL Excess: Unknown; instant crash
GLDirect 2.00
From here on, GLDirect is split between "Game" and "CAD" wrappers, denoted by either a "G" or a "C" after the version number where written in this writeup.
OpenGL Version String (2.00C)
Vendor: SciTech Software, Inc. Renderer: GLDirect S3 Inc. ViRGE/DX/GX Version: 1.2 Mesa 3.1
OpenGL Version String (2.00G)
Vendor: SciTech Software, Inc. Renderer: GLDirect Version: 1.1
GLDirect 2.00C likes to complain about insufficient color precision.
Changing the color precision from 24-bit up to the maximum 32-bit color does absolutely nothing.
The CAD wrapper is very clearly intended for non-gaming workloads, given how easily it crashes things. It is strange, then, that it is labeled as a "maximum compatibility" wrapper/driver.
I am not using the SciTech GLDirect 3.0 driver beta seen in the video because it requires a card supporting a DirectX version higher than 6, which is the most the S3 ViRGE supports. I may revisit this idea with a later graphics card in a future video for more thorough testing.
Using the 2.00G wrapper, Minecraft Alpha and Beta have many visual bugs. Text is not rendered at all, for example, and the world selection screen is eerily dim compared to what it should be. Beta in particular extends this darkness to the title screen as well.
Under 2.00G, Minecraft Beta 1.7.3 with Optifine no longer has this darkness.
Under 2.00G, Minecraft Release 1.5.2... inverts the colors of the Mojang logo?
Did you know that if you fall out of the Half Life intro tram in just the right place, you get to see the multiverse?
The framerate starts to rise when this happens: once the tram becomes unstuck, it appears to trigger the next loading screen. Unfortunately, that loading screen is what seals your fate. Once the tram stops, though, you at least get to meet biblically-accurate Half Life rendering at a smooth double-digit framerate!
Performance (2.00C)
GLGears: 63 FPS
ClassiCube 1.3.5/1.3.6: Unknown; game fails to render
Minecraft in-20091223-1459: Unknown; crash on game launch
Minecraft Alpha 1.0.4: Unknown; crash on game launch
Minecraft Beta 1.7.3: Unknown; crash on game launch
Minecraft Beta 1.7.3 w/Optifine: Unknown; crash on game launch
Minecraft Release 1.5.2: Unknown; assumed crash on game launch based on previous versions' behavior
Tux Racer: Unknown; game fails to render (no longer seizure inducing at least)
Half Life: Unknown; crash on game launch
GL Excess: Unknown; instant crash
Performance (2.00G)
GLGears: 390 FPS; only a black screen was rendered.
ClassiCube 1.3.5/1.3.6: 10-30 FPS range, ~12-15 FPS on average; most of the game still does not render, but text, the hotbar, hand items, and very occasional flickers of geometry do render. Seizure warning.
Minecraft in-20091223-1459: Unknown; crash on world load
Minecraft Alpha 1.0.4: Unknown; crash on world load
Minecraft Beta 1.7.3: Unknown; crash on world load
Minecraft Beta 1.7.3 w/Optifine: Unknown; crash on world load
Minecraft Release 1.5.2: Unknown; crash on game launch
Tux Racer: Unknown; crash on game launch
Half Life: 4-5 FPS; game physics are almost entirely broken down and I'm pretty sure you end up phasing into a higher plane of existence along the way. Trying to enter the training room crashed the entire PC afterwards.
GL Excess: Unknown; instant crash
GLDirect 3.00
OpenGL Version String (3.00G)
Vendor: SciTech Software, Inc. Renderer: GLDirect Version: 1.1
OpenGL Version String (3.00C)
Vendor: SciTech Software, Inc. Renderer: GLDirect S3 Inc. ViRGE/DX/GX Version: 1.2 Mesa 3.1
GLDirect 3.00, both the CAD and Game versions, appears to behave identically to GLDirect 2.00 in almost all cases unless stated otherwise.
Performance (3.00G)
GLGears: 249 FPS; gears are rendered completely incorrectly
ClassiCube 1.3.5: 15-20 FPS on average; most of the game fails to render and the system hard-crashes after a few seconds.
ClassiCube 1.3.6: Insta-crash.
Minecraft: Crash on world load across all versions. Didn't bother testing Release 1.5.2.
Tux Racer: Unknown; crash on game launch
Half Life: ~4 FPS; extremely choppy audio in tutorial level.
GL Excess: Unknown; instant crash
Performance (3.00C)
GLGears: 80 FPS; Perfectly rendered
ClassiCube 1.3.5/1.3.6: Unknown; renders a white screen
Minecraft: Crashes on game launch. The game may also complain about color depth here.
Tux Racer: Unknown; renders a white screen
Half Life: Unknown; renders a white screen and then crashes on game launch
GL Excess: Unknown; instant crash
Techland S3D
We've now moved on from GLDirect! From here on out, each wrapper is instead a discrete opengl32.dll file that must be dropped into the folder of whatever program you'd like to run it with.
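Swapping wrappers by hand gets tedious, so scripting the dll swap helps. Here's a sketch of the idea in Python (the folder layout and paths are hypothetical, and on the actual Windows 98 box this job would be done with batch files): it copies a chosen wrapper's opengl32.dll into a game's directory, which works because Windows resolves opengl32.dll from the application's own folder before the system copy.

```python
import shutil
from pathlib import Path

# Hypothetical layout: one folder per wrapper, each holding its own opengl32.dll.
WRAPPERS = {
    "techland_s3d": Path(r"C:\wrappers\techland_s3d"),
    "techland_d3d": Path(r"C:\wrappers\techland_d3d"),
    "s3mesa": Path(r"C:\wrappers\s3mesa"),
    "altogl": Path(r"C:\wrappers\altogl"),
}

def switch_wrapper(wrapper_dir: Path, game_dir: Path) -> Path:
    """Copy a wrapper's opengl32.dll into the game's folder; return the destination."""
    src = wrapper_dir / "opengl32.dll"
    dest = game_dir / "opengl32.dll"
    shutil.copyfile(src, dest)
    return dest

# e.g. switch_wrapper(WRAPPERS["techland_s3d"], Path(r"C:\Games\Quake2"))
```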
OpenGL Version String
Vendor: Techland Renderer: S3 Virge 3093KB texmem KNI Version: 1.1 beta 6
Right off the bat, things appear to be taking a turn for the interesting as GLGears fails to render anything.
Performance
GLGears: 60 FPS, but only because a black screen is rendered.
ClassiCube 1.3.5/1.3.6: Crashes on game launch but renders a solid blue screen.
Minecraft in-20091223-1459: We load into a world! Have fun trying to play it, though. Rendering is very flickery and broken; there may be some kind of issue with z-buffering. 12-15 FPS, if that matters at all in this case.
Minecraft Alpha 1.0.4: Crashes on game launch.
Minecraft Beta 1.7.3: Renders an inverted vignette that slowly grows darker.
Minecraft Beta 1.7.3 w/Optifine: Crashes on world load.
Minecraft Release 1.5.2: Rendered the title screen with many errors for a brief moment before turning to a black screen and crashing.
Tux Racer: Unknown; renders mostly a white screen. The game does respond to user inputs, and the rendered scene changes based on those inputs, but no textures or complex objects are ever rendered. Instead, you only get the white "floor" plane, a solid blue skybox, and translucent boxes where text, objects, and particles should've been.
Half Life: Crash on game launch; absolutely RAVAGES the title screen in ways I thought only the Alliance AT3D could.
GL Excess: Actually loads! But renders only solid black or white screens after the initial loading screen.
Techland D3D
Two more wrappers left after this one! This wrapper and the Techland S3D wrapper that came before it were both originally created by Techland for their "Crime Cities" game. As an OpenGL title in an era when OpenGL support was spotty at best across the dozens of graphics vendors that existed at the time, it necessitated a set of OpenGL wrappers that could translate calls to other APIs.
Anyway, I originally thought that there wasn't going to be much that'd be interesting about this wrapper based on how Minecraft (didn't) work, but I was quickly proven wrong in the things I tested afterwards!
OpenGL Version String
Vendor: Techland Renderer: Direct3D (display, Primary Display Driver) KNI Version: 1.1 beta 6
Performance
GLGears: ~290 FPS, but only because a black screen is rendered.
ClassiCube 1.3.5/1.3.6: 2-5 FPS. Runs with z-buffering issues. Faster than software rendering, but not by much (SW rendering was roughly 1-3 FPS)
Minecraft: Renders only a black screen and (usually) crashes.
Tux Racer: Unknown; renders a mostly-white screen. However, shading works well enough that you can just barely distinguish what is what.
Half Life: Crash on game loading after the title screen; absolutely RAVAGES the title screen in ways I thought only the Alliance AT3D could.
GL Excess: Actually runs! Performance is alright in some parts but generally remains low. Also, it erroneously uses the exact same texture for every single rendered object.
S3Mesa/AltOGL
So, it turns out that there may or may not be some level of bugginess involved with S3Mesa and AltOGL on my current testing setup. Whereas both wrappers were (somewhat) stable and able to load into both Minecraft and Half Life with relatively little trouble in previous experiments with the ViRGE, this time around they don't appear to be working as expected. Something may have gotten screwed up thanks to having installed and used multiple different versions of GLDirect before S3Mesa and AltOGL.
The best-case scenario would have been to start each wrapper's test run from a fresh install of Windows 98, but I didn't do that here. As a result, take the findings for S3Mesa and AltOGL with a grain of salt.
OpenGL Version String (S3Mesa)
Vendor: Brian Paul Renderer: Mesa S3GL V0.1 Version: 1.1 Mesa 2.6
OpenGL Version String (AltOGL)
Vendor: Brian Paul Renderer: altD3D Version: 1.2 Mesa 3.0
Performance - AltOGL
GLGears: 125 fps
ClassiCube 1.3.5/1.3.6: crash + system lockup
Minecraft in-20091223-1459: crashes on world load (normally it's much more stable according to previous videos; see here for reference: Indev on S3 ViRGE using AltOGL)
Minecraft: Aside from Beta 1.7.3 with Optifine which crashes on world load, all versions of the game crash instantly.
Tux Racer: Freezes on main menu
Half Life: Instacrash
GL Excess: Instacrash
Performance - S3Mesa
GLGears: i forgor to run it 💀
ClassiCube 1.3.5/1.3.6: No crash, but everything is rendered in solid white rather than with textures. Performance is likely in the single digits.
Minecraft in-20091223-1459: crashes on world load (normally it's much more stable according to previous videos; see here for reference: Indev on S3 ViRGE using S3Mesa)
Minecraft: Aside from Beta 1.7.3 with Optifine which crashes on world load, all versions of the game crash instantly.
Tux Racer: Renders just about everything except for the game terrain and player character. Performance seems to just barely reach double-digits.
Half Life: Instacrash; textures missing on main menu (like Minecraft Indev, this is an unexpected result, as the game is normally able to run.)
GL Excess: Instacrash
Misc Notes
Running HWinfo32 crashes the system regardless of whether KernelEX is enabled or not.
GLDirect 5 refuses to do anything other than software rendering due to the lack of higher DirectX support by the Virge.
GLDirect 3 can "run", but crashes the entire system after a few seconds in Classicube using the "game" wrapper.
GLDirect 3, unlike 5, has no license code available for free use.
I didn't have the opportunity to record while GLDirect 3's trial was active, and it has already expired. To avoid having to reinstall everything to get a new trial (I've already tried rolling back the system calendar), I will instead be using GLDirect 2, for which I have in fact found a license code.
GLDirect 2 has a fourth option for OpenGL acceleration beyond the CAD/Game DirectX hardware acceleration and the CPU software acceleration, which is a DX8-based wrapper that appears to have later debuted fully in GLDirect 3.x and up. I think it's safe to assume that the CAD/Game DX6-based wrappers were then deprecated and received no further development after GLDirect 2.x given the pivot to a newer DirectX API.
There are a number of other wrappers available for the S3 ViRGE, and even an OpenGL MCD for Windows 2000. However, I'm sticking with the five that were listed here as they implement enough of the OpenGL spec to be (more) widely compatible with a number of games. The S3Quake wrapper, for example, implements only enough of the spec to run GLQuake, and I have never gotten it to even launch any other titles.
I really, sincerely didn't expect MC Beta 1.7.3 to launch at all even with Optifine, given how from prior testing I found that the game would instantly crash on anything higher than Classi--nevermind, it just crashed on world-load. (S3Mesa, AltOGL, TechlandD3D, TechlandS3D, GLDirect 1.x)
Non-Optifine Beta 1.7.3 crashes before even hitting the Mojang splash screen. (S3Mesa, AltOGL, TechlandD3D, GLDirect 1.x)
Non-Optifine Beta 1.7.3 gets into a world! All text is missing though as are the blocks. Performance is expectedly horrendous. (TechlandS3D)
Making batch files to handle version-switching instead of doing it by hand is nice.
Beyond the test setup apparently no longer being ideal for S3Mesa and AltOGL, comparison against some test runs done on a 1.1GHz Athlon XP system also reveals that the ViRGE performs far better with a faster CPU like that Athlon XP than with the 750MHz Pentium III this series of experiments is built upon. As a result, this experiment may eventually be redone in the future using that faster CPU.
I did some more programming #28
I also applied for a job, but that's neither here nor there.
I started moving from the ogl33 crate to the gl crate. It's mostly done, and I can feel how much better made these libraries are, the gl one and the sdl2 one. Did I mention that I also moved over from beryllium, or whatever I was using before, to the actual sdl2 crate?
I somewhat understand why the tutorial used the other crates; they're a subset of the complete ones, so for a simple tutorial like that, the smaller ones are more convenient.
Regardless, the code is almost completely migrated. I'm just having issues with getting SDL to load the OpenGL functions, and I still need to rewrite event handling. The beryllium wrapper used a fundamentally different system than the actual sdl2 crate does. From what I've seen, the sdl2 way is better. Something about "EventPump"s?
Anywho, progress is progress. I get closer every day I guess. What with school starting up, progress might slow I think, but it is what it is.
Another thing I was thinking about was the 'one project a week' schedule. I think the different types of projects take different amounts of time. I have experience with writing, and I don't have experience with drawing, but for certain simple things like what I made last time, not as much time is needed. I have no idea how to make music, so that will take a lot of time, but making a module for programming will probably be pretty fast (this current project is primarily boilerplate. That type of work and refactoring are both intensive, and so take much longer). I think I will plan to have different amounts of time for each: writing projects get 3-4 days, drawing gets 5-6, music gets 12-13 and programming gets 6-7. On average, still a 28 day cycle, but the allocation is shifted to reflect my skill. It might shift again, but I think this is a lot more doable, specifically for music.
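For what it's worth, the "still a 28 day cycle" claim checks out; a quick sanity check using the midpoint of each planned range:

```python
# Midpoints of the planned ranges: writing 3-4 days, drawing 5-6,
# music 12-13, programming 6-7.
schedule = {
    "writing": (3, 4),
    "drawing": (5, 6),
    "music": (12, 13),
    "programming": (6, 7),
}

cycle_length = sum((lo + hi) / 2 for lo, hi in schedule.values())
print(cycle_length)  # 28.0 -- still a 28-day cycle on average
```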
Although, I feel like the only problem I had was that I just couldn't start. Analysis paralysis kept me from doing anything. I couldn't put any kind of noise into the stupid program because I was too anxious about it being good. It feels like the solution is to just start and stop worrying about how good it will be, but at the same time I know that my lack of confidence comes from a lack of knowledge about audio, music, and music theory. I know that the solution to that root is to learn, and learning takes time. In this situation I need to trust what I know over what I feel and give that time to the problem.
Once I finish this boilerplate bs, I'm going to do 2 drawings in a row, then some writing, then audio, then programming. I think I'm going to ditch the whole "make a big multi-media project every month" bit. I want to make those kinds of things, so if I do what I want, I will do that sometimes. Having structure is good, but too much is... restrictive, or maybe, exhausting? inorganic? over-rigid? I found that as much as having the structure helps, my ultimate goal is to make things. I am sometimes succeeding at that goal, but still feeling bad because I went over my personal deadline, or failing to achieve that goal in the name of my arbitrary deadlines. I need to balance limits for preventing never-ending projects, while still giving myself reasonable amounts of time and leniency so that I can actually finish things instead of forcing myself to move on from a project that is actually not long from being done. It's a hard balancing act, but I'm learning.
I have decided to fork the GLQuake source that id posted to GitHub, and I will try to use a wrapper between OpenGL and citro3D.
I am currently thinking about how I want to go about familiarizing myself with the Nintendo 3DS hardware.
I’m thinking about trying to get QuakeSpasm to work since it has SDL 1.2.15 support. The biggest issues I’d need to tackle are hardware acceleration, and maybe getting the music to play.
Legacy AMD APU Llano Laptop for Emulation tests - Part 2
Software tools and Emulators used.
I covered everything about my laptop and its history in the last part. I went from Windows 10 to Manjaro XFCE 19 and installed the software needed to get good performance, plus the emulators that suit my laptop's hardware. I find AUR builds easy to work with because I can find plugins and the standalone emulators needed for good testing. AUR builds work almost perfectly for me, aside from some build time; as long as they can install a program or an extension, I'm good.
For my laptop, I installed TPC, or TurionPowerControl, to overclock the CPU. The BIOS doesn't have an option for a permanent overclock, since it's a laptop. However, the CPU uses a lot of voltage by default, which leaves big room for overclocking with only a slightly higher power draw. At stock, it runs a 1.5GHz base clock at 1.1625V and a 2.4GHz boost at 1.415V. It can easily be undervolted to cool the laptop a bit; the temperature limit is 85C, so the CPU throttles if it reaches around that point. An undervolt goes down to 1.5GHz at 1.0625V and a 2.4GHz boost at 1.200V, which brings about a seven percent drop in temperature and a little more battery life. An overclock hits up to 2.3GHz at 1.175V and 2.8GHz at 1.400V, with only a small voltage change on each, and still runs stable. Note that silicon quality differs between individual CPUs, so yours may reach lower or higher voltages when overclocking or undervolting. On average, my laptop's CPU performance increases by about 33%. The Llano APUs are one of the exceptions where you can overclock a laptop without much worry.
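To put numbers on that, the per-clock gains work out like this (figures taken from above; how the overall gain averages out between the base and boost numbers depends on how long the CPU can hold its boost clock, so treat this as a rough sketch):

```python
def pct_gain(stock_ghz: float, oc_ghz: float) -> float:
    """Percent increase from the stock clock to the overclocked one."""
    return (oc_ghz - stock_ghz) / stock_ghz * 100

base_gain = pct_gain(1.5, 2.3)   # base clock: 1.5GHz -> 2.3GHz
boost_gain = pct_gain(2.4, 2.8)  # boost clock: 2.4GHz -> 2.8GHz
print(f"base +{base_gain:.1f}%, boost +{boost_gain:.1f}%")  # base +53.3%, boost +16.7%
```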
I installed Gamemode for Linux, which I explained on the last page. It's useful for setting the power mode of both the CPU and GPU to performance, so they use their highest clock speeds as much as they can. Radeon-Profile is a Linux app that can force the GPU's power mode when you run a program, if you can't use Gamemode for whatever reason. On Windows, you can force performance mode in the Catalyst driver and set Windows to the performance power plan. The driver used on Linux as of this writing is Mesa's r600g driver, version 19.3.
Now let's get to the list of emulators I will be using for each system. Some are using Retroarch.
NES: I'm using Nestopia and Mesen on Retroarch. Both are quite accurate, with the latter being the most accurate. Nestopia is the fastest option, but Mesen can run pretty smoothly too with default settings. Nestopia has much more headroom for Runahead and NES CPU overclocking. I use FireBrandX's digital palettes.
SNES: I use Snes9x mainline for Retroarch and the new standalone Bsnes v110 for testing. Snes9x runs very well and can use both features listed above. Bsnes is used mostly for testing; I'll explain more about it later, but what I can say is that it can run a lot of games at full speed. In my opinion there's no need for Libretro's many old Bsnes cores, and Snes9x mainline is generally suitable. Bsnes AUR builds are available.
N64: On Windows, it was Project64 with Jabo's D3D8 1.6.1 plugin, but after switching to Linux, Mupen64Plus with GlideN64 on the Mesa drivers is a better option. Since February 2020, m64p is free once again, so you don't have to do a DIY build of each plugin to use with Mupen64Plus, and you get a good GUI. Just go to this website: https://github.com/loganmc10/m64p/releases . If it stops being free in the future, you'll have to use AUR files to build the plugins and find the GUI. There is Mupen64Plus Next for Retroarch, but a standalone build is the fastest and most reliable option.
Gamecube/Wii: I use beta builds of Dolphin to measure it. I'll explain more about that later on.
Sega SMS/Genesis/GameGear/32X: I use Genesis Plus GX on Retroarch, and it runs pretty great. For the 32X, I could use Kega Fusion on Windows, or the Picodrive core, but I don't have a 32X game to test.
Sega Saturn: I use Yaba Sanshiro, and it's the fastest emulator you can get for the laptop. Yaba Sanshiro has some great options.
Sega Dreamcast: I installed Redream. It is the fastest Dreamcast emulator available and more accurate than NullDC. I also have Flycast, a fork of Reicast, that is used for testing too: https://flyinghead.github.io/flycast-builds/ (the Ubuntu build is just a Linux build). I recommend using the standalone Flycast builds instead of the Libretro core, since standalone hardware-rendered emulators often run faster than their Retroarch counterparts.
Playstation 1: I use both PCSX-R PGXP and the PCSX-ReARMed core. PCSX-R PGXP has a great option for perspective correction, with much less polygon jitter, while being faster than Libretro's Beetle PSX core and standalone Mednafen. An explanation of those four emulators, and of the Windows PCSX-R with Pete's OpenGL2 tweaks, will also be mentioned.
Playstation 2: I use PCSX2.
GB/GBC: SameBoy. It is more accurate than Gambatte and VBA.
GBA: mGBA; it is the fastest and most accurate GBA emulator I've used.
NDS: Despite the development drama, I use standalone Desmume mainline. It's pretty good for this laptop since you have the option to use frameskip. melonDS with JIT will be included.
3DS: Citra Canary is the best option.
PSP: PPSSPP.
Dos: Dosbox ECE or Dosbox-X at best. Installing Dosbox ECE is difficult and you would need to build it yourself; I wanted a 32-bit build, since the dynamic core is pretty robust in 32-bit. On Linux, however, 32-bit versions of the forks (or of mainline) are kind of hard to find, and the 64-bit dynamic recompiler is slower or has bugs. In the end, I use Dosbox-X, which performs the same as standard Dosbox in normal mode. For dynamic mode, you need a 32-bit version to get full speed on either Windows or Linux.
Win9x: PCem, using v15 with 486 CPUs.
Wine: Linux-specific, for running Windows programs. You can use stable builds or staging builds, and Lutris and Proton can be used as well. However, since my laptop's APU lacks Vulkan support, DX11 games are much harder to run and barely work; the APU isn't really that strong for many DX11 games anyway. DX9 works with the default OpenGL wrapper, but since we're using an AMD GPU with a Mesa Gallium driver, we can run the native DX9 API in Wine with Gallium Nine Standalone. You just install the needed dependencies for the Mesa D3D9 files so the Gallium Nine config can enable native DX9. DX10 is the same story as DX11. DX8 and lower work pretty well for the most part. Note that Wine is not an emulator, but a compatibility layer.
That's the software that will be used for the performance testing on the next page.
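On the Playstation 1 entry's mention of perspective correction: the PS1's GPU interpolates texture coordinates linearly in screen space (affine mapping), which warps textures on polygons viewed at an angle; correct rendering interpolates u/z and 1/z and divides them back out. (The polygon jitter itself comes from the console snapping vertices to integer screen coordinates, which PGXP also addresses.) A minimal illustrative sketch of the interpolation difference, not PCSX-R's actual code:

```python
def lerp(a: float, b: float, t: float) -> float:
    """Plain linear (affine) interpolation, which is all the PS1 GPU does."""
    return a + (b - a) * t

def perspective_correct(u0: float, z0: float, u1: float, z1: float, t: float) -> float:
    """Interpolate u/z and 1/z linearly in screen space, then divide them back out."""
    return lerp(u0 / z0, u1 / z1, t) / lerp(1.0 / z0, 1.0 / z1, t)

# Halfway across a span whose endpoints sit at depths 1 and 3:
print(lerp(0.0, 1.0, 0.5))                           # 0.5  (affine result)
print(perspective_correct(0.0, 1.0, 1.0, 3.0, 0.5))  # 0.25 (correct result)
```

The correct result is biased toward the nearer endpoint, which is exactly the effect affine mapping loses.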
Next Page on CPU emulation tests.
Previous Page on the laptop overview.
Wineskin winery 2.6.2

Over the course of playing osu! on OSX I've experienced multiple issues and found ways to fix them. This is assuming you are using the Wineskin wrapper to run osu! on OSX. Wineskin can be found by right-clicking the osu! icon and clicking "Show package contents". The osu! folder can be found by going to Show package contents > drive_c > osu!. I highly recommend you back up the entire osu! folder. Some of these steps initially made my game crash or get stuck in an update loop. In the event that happens you may have to start from scratch and redownload the Wineskin wrapper from the OSX osu! tumblr (found here: ).
If you haven't already, start by updating the Wineskin wrapper. The wrapper is rather outdated and probably won't fix anything, but it can't really hurt.
Click "Tools" (next to Configuration and Options).
Click "Update Wrapper" (in the right column).
The wrapper version (shown near the bottom of the window above "Install Software") should now be 2.6.2 or higher. Don't close the Wineskin window yet; we'll use it later.
Next you'll need to update the Wine engine. I highly recommend you do this regardless of whether you are facing issues or not. The engine included with the osu! Wineskin is incredibly outdated (in software development, 2-3 years is a very long time). The engine is essentially what runs osu!, so this is one of the more volatile steps. Updating the engine provided me with a significant performance boost, fixed many of the issues I was having previously, and allowed my game to run under OSX driver mode (more on that later). To update the engine we'll need a way to download engines. Now open Wineskin Winery and follow these steps:
Osu! runs really slow and/or the Song Setup window does not show
Click "Set Screen Options" (under the "Tools" tab in Wineskin).
Uncheck "Auto Detect GPU Info for Direct3D".
Uncheck "Use Direct3D Boost (if available)".
If your computer is rather old or you're running macOS Mojave you may experience problems with various elements of the Wineskin, including performance or UI elements. In this case, you may wish to reenable XQuartz as the graphical engine for the Wineskin. Follow the steps for "Help! My game won't open and/or opens this program called XQuartz!" but in reverse. You will also need to download and install XQuartz ( ). Consider this a last resort if nothing else works.
Sometimes, the quarantine attribute will still be stuck on the Wineskin wrapper. The quarantine attribute will prevent osu! from writing files and performing all the actions that it needs to. Here is how to disable the quarantine attribute.
Discord Rich Presence doesn't work! Download a copy of the working discord-rpc.dll here: .
Keyboard layouts other than the default US layout may not work with osu!. This is especially true for keyboard layouts that produce characters other than those part of the ASCII format, such as Japanese, Chinese or Korean. Set your keyboard layout to the US layout to fix this.
In the editor, you'll notice a strip of black on top of "compose design timing song setup". The menu bar is still there, it's just completely black. Click on that strip and then the menus will display.
Oh no! I'm stuck in an update loop and/or I'm crashing on startup! Unfortunately, there isn't a concrete set of steps to resolve this. There are several things you can try, but at worst you'll have to reinstall the Wineskin package. If you haven't already, make sure to do a data backup. The steps to do this are listed under "How to do a data backup".
Catitaa6 wrote: thank you! runs 100%, except for one problem. i've installed yesterday and every time i play whenever the HP bar starts dropping to around 10-15% the game blacks out, really annoying cause then i just stop being able to see the game and obviously always lose as i can't keep playing to come back. soon as the song is lost i'm able to see the game just fine once again. Some help would be appreciated on this one, thanks
Only thing I can think of is OpenGL issues, but your GL version is only slightly lower than mine so I doubt it's a huge issue. You can try disabling as many graphical effects in the osu! options as you can, and if that doesn't work, enable compatibility mode. If even that doesn't work I'll add a section in my guide later that explains how to revert your Wine version to an older one that may work better.

Retroarch crashes when loading rom

RETROARCH CRASHES WHEN LOADING ROM INSTALL
RETROARCH CRASHES WHEN LOADING ROM DRIVERS
RETROARCH CRASHES WHEN LOADING ROM PC
atmosphère crashes when loading Retroarch For $4 / €4 a month or $20 / €20 a year, players Scroll down to Options Last Release: Downloads: 218981.
RETROARCH CRASHES WHEN LOADING ROM DRIVERS
Supported Graphics Drivers gl1 via GLDirect (D3D9 -> OpenGL 1.1 wrapper) d3d11 d3d12 gdi Supported Input Drivers winraw xinput Tested (Provided) Cores fbalpha fceumm snes9x2005 gambattle vba_next genesis_plus_gx mednafen_psx Notes For Windows RT devices, you’ll need a proper jailbreak for your device, and (on 8.1) sign the RetroArch.exe and core … I have closed the content manually before loading a new one, but the crash still occurs.
RETROARCH CRASHES WHEN LOADING ROM INSTALL
But don’t fire up your browser: you can install cores from inside RetroArch. These individual emulators are called cores within RetroArch, and you’re going to need to download the appropriate cores for the games you want to run. RetroArch isn’t itself an emulator; instead, it’s a front-end capable of running a wide number of emulators. When I boot RetroArch up, it typically will launch whatever content just fine the first time. RetroArch will crash when run with invalid arguments that do not start with a dash (-). Fixed ports gamelist.xml - Updated es_systems.cfg - Updated RetroOz Controls.pdf. Can you post a screenshot of what it shows from the beginning all the way down to "Lock Installed Core"? To fix this, head to Controls and set “User 1 Analog To Digital Type” to Left Analog. I’ve been trying to load Nemesis '90 Kai using the PX68k core and each time I just get a brief white screen followed by the frontend crashing. Make sure “vulkan” is selected or use “opengl” if your GPU doesn’t support it. If version 2.0.2 is crashing after running a BIOS/game, or you are using the Russian language, please update; in other cases it is not necessary. It doesn’t happen if I select the restart RetroArch option in between attempts at loading content. RetroArch's PlayStation 1 emulation is pretty awesome.
RETROARCH CRASHES WHEN LOADING ROM PC
Crashing on startup - PC RetroArch Gaming. Consequently, when you press the select button on your controller in a NES game to make a selection in a menu (in The Legend of Zelda, for example), nothing will happen, as RetroArch is expecting a second. I'm using Atmosphere. RetroArch crashing: okay, every time I try to load a game on Mupen64plus or Mupen64Plus Next it crashes RetroArch. RetroArch is a modular multi-system emulator system that is designed to be fast and lightweight. This time we are going to install RetroArch to the Switch … Launch the game or core you're having trouble with and let it crash/fail to load. I also want to say that N64 games don't load up.

0 notes
Text
Java lwjgl spotlights

#Java lwjgl spotlights install#
#Java lwjgl spotlights 64 bits#
In this book we will learn the principal techniques involved in developing 3D games. We will develop our samples in Java and we will use the Lightweight Java Game Library (LWJGL). The LWJGL library enables access to low-level APIs (Application Programming Interfaces) such as OpenGL. LWJGL is a low level API that acts like a wrapper around OpenGL. If your idea is to start creating 3D games in a short period of time maybe you should consider other alternatives. By using this low level API you will have to go through many concepts and write lots of lines of code before you see the results. The benefit of doing it this way is that you will get a much better understanding of 3D graphics and also you can get better control.
#Java lwjgl spotlights install#
As said in the previous paragraphs, we will be using Java for this book. We will be using Java 10, so you need to download the Java SDK from Oracle’s pages. Just choose the installer that suits your Operating System and install it. This book assumes that you have a moderate understanding of the Java language.
#Java lwjgl spotlights 64 bits#
You may use the Java IDE you want in order to run the samples. You can download IntelliJ IDEA, which has good support for Java 10. Since Java 10 is only available, by now, for 64 bits platforms, remember to download the 64 bits version of IntelliJ. IntelliJ provides a free open source version, the Community version, which you can download from here.
For building our samples we will be using Maven. Maven is already integrated in most IDEs and you can directly open the different samples inside them. Just open the folder that contains the chapter sample and IntelliJ will detect that it is a Maven project. Maven builds projects based on an XML file named pom.xml (Project Object Model) which manages project dependencies (the libraries you need to use) and the steps to be performed during the build process.
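For reference, a minimal pom.xml dependency section for an LWJGL 3 project might look like the following. The version number and the natives classifier are my assumptions, so check lwjgl.org's build customizer for the exact coordinates for your platform:

```xml
<!-- Hypothetical LWJGL 3 dependencies; verify versions on lwjgl.org -->
<dependencies>
    <dependency>
        <groupId>org.lwjgl</groupId>
        <artifactId>lwjgl</artifactId>
        <version>3.2.3</version>
    </dependency>
    <dependency>
        <groupId>org.lwjgl</groupId>
        <artifactId>lwjgl-opengl</artifactId>
        <version>3.2.3</version>
    </dependency>
    <!-- Native binaries; the classifier is platform-specific -->
    <dependency>
        <groupId>org.lwjgl</groupId>
        <artifactId>lwjgl</artifactId>
        <version>3.2.3</version>
        <classifier>natives-windows</classifier>
    </dependency>
</dependencies>
```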

0 notes
Text
Writeup: Forcing Minecraft to play on a Trident Blade 3D.
The first official companion writeup to a video I've put out!
youtube
So. Uh, yeah. Trident Blade 3D. If you've seen the video already, it's... not good. Especially in OpenGL.
Let's kick things off with a quick rundown of the specs of the card, according to AIDA64:
Trident Blade 3D - specs
Year released: 1999
Core: 3Dimage 9880, 0.25um (250nm) manufacturing node, 110MHz
Driver version: 4.12.01.2229
Interface: AGP 2x @ 1x speed (wouldn't go above 1x despite driver and BIOS support)
PCI device ID: 1023-9880 / 1023-9880 (Rev 3A)
Mem clock: 110MHz real/effective
Mem bus/type: 8MB 64-bit SDRAM, 880MB/s bandwidth
ROPs/TMUs/Vertex Shaders/Pixel Shaders/T&L hardware: 1/1/0/0/No
DirectX support: DirectX 6
OpenGL support: - 100% (native) OpenGL 1.1 compliant - 25% (native) OpenGL 1.2 compliant - 0% compliant beyond OpenGL 1.2 - Vendor string:
Vendor : Trident Renderer : Blade 3D Version : 1.1.0
And as for the rest of the system:
Windows 98 SE w/KernelEX 2019 updates installed
ECS K7VTA3 3.x
AMD Athlon XP 1900+ @ 1466MHz
512MB DDR PC3200 (single stick of OCZ OCZ400512P3) 3.0-4-4-8 (CL-RCD-RP-RAS)
Hitachi Travelstar DK23AA-51 4200RPM 5GB HDD
IDK what that CPU cooler is but it does the job pretty well
And now, with specs done and out of the way, my notes!
As mentioned earlier, the Trident Blade 3D is mind-numbingly slow when it comes to OpenGL. As in, to the point where at least natively during actual gameplay (Minecraft, because I can), it is absolutely beaten to a pulp using AltOGL, an OpenGL-to-Direct3D6 "wrapper" that translates OpenGL API calls to DirectX ones.
Normally, it can be expected that performance using the wrapper is about equal to native OpenGL, give or take some fps depending on driver optimization, but this card?
The Blade 3D might honestly be better off doing what the S3 ViRGE did and shipping no OpenGL ICD in any driver release, period.
For the purposes of this writeup, I will stick to a very specific version of Minecraft: in-20091223-1459, the very first version of what would soon become Minecraft's "Indev" phase, though this version notably lacks any survival features and aside from the MD3 models present, is indistinguishable from previous versions of Classic. All settings are at their absolute minimum, and the window size is left at default, with a desktop resolution of 1024x768 and 16-bit color depth.
(Also the 1.5-era launcher I use is incapable of launching anything older than this version anyway)
Though known to be unstable (as seen in the full video), gameplay in Minecraft Classic using AltOGL reaches a steady 15 fps, nearly triple that of the native OpenGL ICD that ships with Trident's drivers for the card. AltOGL also is known to often have issues with fog rendering on older cards, and the Blade 3D is no exception... though, I believe it may be far more preferable to have no working fog than... well, whatever the heck the Blade 3D is trying to do with its native ICD.
See for yourself: (don't mind the weirdness at the very beginning. OBS had a couple of hiccups)
youtube
youtube
Later versions of Minecraft were also tested, where I found that the Trident Blade 3D follows the same, as I call them, "version boundaries" as the SiS 315(E) and the ATi Rage 128, both of which are cards that easily run circles around the Blade 3D.
Version ranges mentioned are inclusive of their endpoints.
Infdev 1.136 (inf-20100627) through Beta b1.5_01 exhibit world-load crashes on both the SiS 315(E) and Trident Blade 3D.
Alpha a1.0.4 through Beta b1.3_01/PC-Gamer demo crash on the title screen due to the animated "falling blocks"-style Minecraft logo on both the ATi Rage 128 and Trident Blade 3D.
All the bugginess of two much better cards, and none of the performance that came with those bugs.
Interestingly, versions even up to and including Minecraft release 1.5.2 are able to launch to the main menu, though by then the already-terrible lag present in all prior versions of the game when run on the Blade 3D make it practically impossible to even press the necessary buttons to load into a world in the first place. Though this card is running in AGP 1x mode, I sincerely doubt that running it at its supposedly-supported 2x mode would bring much if any meaningful performance increase.
Lastly, ClassiCube. ClassiCube is a completely open-source reimplementation of Minecraft Classic in C, which allows it to bypass the overhead normally associated with Java's VM platform. However, this does not grant it any escape from the black hole of performance that is the Trident Blade 3D's OpenGL ICD. Not only this, but oddly, the red and blue color channels appear to be switched by the Blade 3D, resulting in a very strange looking game that chugs along at single-digits. As for the game's DirectX-compatible version, the requirement of DirectX 9 support locks out any chance for the Blade 3D to run ClassiCube with any semblance of performance. Also AltOGL is known to crash ClassiCube so hard that a power cycle is required.
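My guess about the swapped channels — and it is only a guess, not something confirmed from Trident's driver — is a BGRA-vs-RGBA component-order mix-up somewhere between the wrapper and the driver. A tiny sketch of what that mix-up does to a pixel:

```python
# A pure-red pixel stored in BGRA byte order (common for Direct3D-era
# surfaces), then misread as RGBA (the order OpenGL's GL_RGBA expects):
pixel_bgra = bytes([0x00, 0x00, 0xFF, 0xFF])  # B=0, G=0, R=255, A=255

r, g, b = pixel_bgra[0], pixel_bgra[1], pixel_bgra[2]  # RGBA reading
print(f"R={r} G={g} B={b}")  # red comes out blue
```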
Interestingly, a solid half of the accelerated pixel formats supported by the Blade 3D, according to the utility GLInfo, are "render to bitmap" modes, which I'm told is a "render to texture" feature that normally isn't seen on cards as old as the Blade 3D. Or in fact, at least in my experience, any cards outside of the Blade 3D. I've searched through my saved GLInfo reports across many different cards, only to find each one supporting the usual "render to window" pixel format.
And with that, for now, this is the end of the very first post-video writeup on this blog. Thank you for reading if you've made it this far.
I leave you with this delightfully-crunchy clip of the card's native OpenGL ICD running in 256-color mode, which fixes the rendering problems but... uh, yeah. It's a supported accelerated pixel format, but "accelerated" is a stretch like none other. 32-bit color is supported as well, but it performs about identically to the 8-bit color mode--that is, even worse than 16-bit color performs.
At least it fixes the rendering issues I guess.
youtube
youtube
#youtube#techblog#not radioshack#my posts#writeup#Forcing Minecraft to play on a Trident Blade 3D#Trident Blade 3D#Trident Blade 3D 9880
3 notes
·
View notes
Text
Visual studio 2014 download free. full version 64 bit

#Visual studio 2014 freeload full version 64 bit software#
#Visual studio 2014 freeload full version 64 bit code#
CUDA (or Compute Unified Device Architecture) is a parallel computing platform and application programming interface (API) that allows software to use certain types of graphics processing unit (GPU) for general purpose processing, an approach called general-purpose computing on GPUs (GPGPU). CUDA is a software layer that gives direct access to the GPU's virtual instruction set and parallel computational elements, for the execution of compute kernels. This accessibility makes it easier for specialists in parallel programming to use GPU resources, in contrast to prior APIs like Direct3D and OpenGL, which required advanced skills in graphics programming. CUDA is designed to work with programming languages such as C, C++, and Fortran. CUDA was created by Nvidia; when it was first introduced, the name was an acronym for Compute Unified Device Architecture, but Nvidia later dropped the common use of the acronym. CUDA-powered GPUs also support programming frameworks such as OpenMP, OpenACC, OpenCL and HIP by compiling such code to CUDA.
#Visual studio 2014 freeload full version 64 bit software#
A typical CUDA processing flow:
Copy data from main memory to GPU memory.
GPU's CUDA cores execute the kernel in parallel.
Copy the resulting data from GPU memory to main memory.
The CUDA platform is accessible to software developers through CUDA-accelerated libraries, compiler directives such as OpenACC, and extensions to industry-standard programming languages including C, C++ and Fortran.
#Visual studio 2014 freeload full version 64 bit code#
C/C++ programmers can use 'CUDA C/C++', compiled to PTX with nvcc, Nvidia's LLVM-based C/C++ compiler. Fortran programmers can use 'CUDA Fortran', compiled with the PGI CUDA Fortran compiler from The Portland Group. In addition to libraries, compiler directives, CUDA C/C++ and CUDA Fortran, the CUDA platform supports other computational interfaces, including the Khronos Group's OpenCL, Microsoft's DirectCompute, OpenGL Compute Shader and C++ AMP. Third party wrappers are also available for Python, Perl, Fortran, Java, Ruby, Lua, Common Lisp, Haskell, R, MATLAB, IDL, Julia, and native support in Mathematica.

0 notes
Text
Nvidia quadro m5000

NVIDIA QUADRO M5000 SOFTWARE
NVIDIA QUADRO M5000 CODE
CUDA (or Compute Unified Device Architecture) is a parallel computing platform and application programming interface (API) that allows software to use certain types of graphics processing units (GPUs) for general purpose processing, an approach called general-purpose computing on GPUs (GPGPU). CUDA is a software layer that gives direct access to the GPU's virtual instruction set and parallel computational elements, for the execution of compute kernels. This accessibility makes it easier for specialists in parallel programming to use GPU resources, in contrast to prior APIs like Direct3D and OpenGL, which required advanced skills in graphics programming. CUDA is designed to work with programming languages such as C, C++, and Fortran. CUDA was created by Nvidia; when it was first introduced, the name was an acronym for Compute Unified Device Architecture, but Nvidia later dropped the common use of the acronym. CUDA-powered GPUs also support programming frameworks such as OpenMP, OpenACC, OpenCL and HIP by compiling such code to CUDA.
NVIDIA QUADRO M5000 SOFTWARE
A typical CUDA processing flow:
Copy data from main memory to GPU memory.
GPU's CUDA cores execute the kernel in parallel.
Copy the resulting data from GPU memory to main memory.
The CUDA platform is accessible to software developers through CUDA-accelerated libraries, compiler directives such as OpenACC, and extensions to industry-standard programming languages including C, C++ and Fortran.
NVIDIA QUADRO M5000 CODE
C/C++ programmers can use 'CUDA C/C++', compiled to PTX with nvcc, Nvidia's LLVM-based C/C++ compiler, or by clang itself. Fortran programmers can use 'CUDA Fortran', compiled with the PGI CUDA Fortran compiler from The Portland Group. In addition to libraries, compiler directives, CUDA C/C++ and CUDA Fortran, the CUDA platform supports other computational interfaces, including the Khronos Group's OpenCL, Microsoft's DirectCompute, OpenGL Compute Shader and C++ AMP. Third party wrappers are also available for Python, Perl, Fortran, Java, Ruby, Lua, Common Lisp, Haskell, R, MATLAB, IDL, Julia, and native support in Mathematica.

1 note
·
View note
Text
Instructions in JOGL Project 1 Info.txt Programming OpenGL using the Java OpenGL
Instructions in JOGL Project 1 Info.txt Programming OpenGL using the Java OpenGL (JOGL) wrapper library, follow all instructions in the .txt file, relevant textbooks and lecture notes (Week 8 Notes – 2D Transformation.pptx) included. All JOGL2020 files used to derive/extend classes for the class is also attached in a zip file (JOGL2020.zip) for easy installation on Eclipse IDE on your Windows…
View On WordPress
0 notes
Text
Scarface Pc Game Crack File

By lashaziorlashazior. Last updated
This is a comprehensive guide on how to get Scarface working on your PC and what my stream setup looks like. I can only attest to this working on Windows 7, but the guide I borrowed this from says it will work on Windows 8, 8.1, and 10. It is not guaranteed to work and I am not responsible for any screw ups you do to your computer. I recommend reading the whole guide thoroughly to keep yourself from screwing things up.
First off, the issue with Scarface not running on anything other than XP has to do with how it renders pixels. Whatever plugins they used to code the game, they are not liked by newer operating systems. The simple fix that we're going to implement here is Wine. For those familiar with Linux systems, you may have or already use Wine. Wine lets you use Windows applications under Linux based systems, but it just so happens that some plugins can be used to re-render Scarface on newer Windows operating systems.
The plugins we are using here are under the Wine D3D for Windows Link. Essentially, they re-render Direct3D pixels through an OpenGL wrapper based on the Wine D3D that Linux uses, allowing us to play Scarface on our newer operating systems.
So how do we get this working? We follow the steps listed below:
1. Make sure your Scarface is on Patch 1.00.2 before attempting these steps.
2. Download Wine D3D for Windows v1.6.2 - Link
3. Locate your Scarface directory (usually it's under C:\Program Files (x86)\Radical Games\Scarface)
4. Copy the files d3d9.dll, libwine.dll, and wined3d.dll to this directory (the same folder where Scarface.exe is located). Be sure you do NOT copy the whole zip's contents over; it includes more plugins than are needed and your game will just not work.
5. If you reached this step, attempt to start your game. If it works, you're golden. Otherwise, you may have to look into other methods such as 3Dripper - Link
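Step 4 is easy to get wrong, so here's a small sketch that copies only the three needed DLLs. The extracted-zip folder name is my invention and the Scarface path is the guide's default — adjust both to your setup:

```python
# Copy ONLY d3d9.dll, libwine.dll and wined3d.dll next to Scarface.exe.
# Copying the whole zip's contents will break the game, per step 4.
import shutil
from pathlib import Path

NEEDED_DLLS = ("d3d9.dll", "libwine.dll", "wined3d.dll")

def install_wined3d(extracted_zip_dir, scarface_dir):
    """Copy the three Wine D3D DLLs into the Scarface install folder."""
    src, dst = Path(extracted_zip_dir), Path(scarface_dir)
    copied = []
    for dll in NEEDED_DLLS:
        shutil.copy2(src / dll, dst / dll)  # overwrites any existing copy
        copied.append(dst / dll)
    return copied

# e.g. install_wined3d("wine-d3d-1.6.2",
#                      r"C:\Program Files (x86)\Radical Games\Scarface")
```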
If you reached this part and your game is running then you are good to go. If you want to stream this game, read more below.
----------------------------------------------------------------------------------------------------------
If you thought getting this game to work on newer operating systems was bad, getting it to stream is probably worse. For starters, I've only managed to get OBS Classic to recognize a windowed mode of the game. Secondly, it tends to crash at random intervals later in the run, possibly due to memory or graphics overload, I'm not sure which. As such, I recommend saving often.

The first step to getting it streamable is we need to force the game into windowed mode. For this, I highly highly highly (say it with me) HIGHLY! recommend using DXWnd - Link. This nifty program brute forces windowed mode on older games and is just overall pleasant to work with the interface. I actually use this program to force Stronghold Crusader and The Godfather as well, although the latter takes a little more work that I won't go into for this guide.
I'm not going to go into all the gritty details other than linking some settings and explaining a few things.
To start off, we want to add Scarface as a game under DXWnd. File -> Import should load up a folder with a bunch of game names. You can find Scarface in this section. The other option is Edit -> Add to make a blank setting. Here you select the Name you want it to be called (irrelevant, but I call mine the exact game name for completeness' sake) and select the Path and Launch .exe. These are still in the same folder as before (usually it's under C:\Program Files (x86)\Radical Games\Scarface\Scarface.exe). Make sure you link directly to the .exe in both Path and Launch.
From here, there's a few different settings we can tinker with. Under the Main tab, which is where we choose Path and Launch, there is settings for resolution and windowed mode. Selecting 'Run in Window' under Generic will force windowed mode and we can change the resolution under position. For instance, I stream my gameplay at 720p, so I set my W and H to 1280 and 720 respectively. With windowed mode selected and these numbers, Scarface will run at 1280x720 in a windowed mode.
BIG IMPORTANT NOTE HERE - if you want OBS Classic to recognize the game in windowed mode, we can't use fullscreen borderless. In other words, your game screen can't be the same resolution as your monitor as modern games allow you to do this for alt-tabbing purposes. OBS Classic can't recognize Scarface in this manner which is why I personally prefer doing a 720p capture.
Most of the other settings in DXWnd are irrelevant for our situation as we mostly just want it to be recognized in OBS and playable. We have the window selection fixed to our liking, now we just need to make sure the mouse stays in the window. To do this, we go under the Input tab, look under Cursor Clipper (second on the left side) and make sure that it is in the ON position. This forces the mouse to stay within the windowed game window like a normal fullscreen game operates.
Once you have your desired window settings and the mouse no longer clipping out of the game window, you should be able to get it recognized in OBS Classic. The source we use here is Window Capture on a specific application. Simply boot your game up, alt-tab, and change the settings to get it to show up in OBS. I'm not going to go into details on how to set up OBS as there are much more in-depth guides, but the simple way I do it is to set my base resolution for the stream to my game resolution. This lets me 'Fit to screen' on the window capture source and have no black borders around the edges.
If you got this far, and your game is working in windowed mode the way you like it and OBS is set up the way you like it, then you probably don't really need to read much further. Except one additional caveat - this game will crash on alt-tab with the Wine D3D and DXWnd method. I'm not exactly sure why but if you are the type of person to alt-tab a lot during gameplay, I recommend you SAVE OFTEN in game. If you are speedrunning this game, save after every mission to alleviate any potential issue with a crash.
Scarface Pc Game
If you have any more specific questions about this guide, feel free to message me personally here on Speedrun.com or send me a Tweet at http://twitter.com/lashazior

0 notes
Text
oh it’s trending because apple is dropping opengl support. from what i’ve heard their opengl drivers were so old and unmaintained or something. i have no clue about what it’s like programming 3d graphics applications on mac with Metal and not sure how possible or likely it is to have someone make a Metal wrapper with a gl and glut-like API but oh well. a lot of game devs are obviously frustrated with the decision. i wonder what gzdoom’s solution to this will be.
3 notes
·
View notes
Text
Writeup: To boldly stumble… pushing the Alliance AT3D to its limits
Yeah. We're venturing into levels of absolute jank that none have ever gone before.
This writeup is also available on The Retro Web!
Fun fact, this card singlehandedly killed Alliance Semiconductor's graphics division! All three successor cards that were planned for release after the AT3D were promptly scrapped and never heard of again. Alliance themselves would crumble not long after, and at present is a shell of its former self that manufactures only memory chips as opposed to… well, mainly memory chips and a few other things on the side.
Aaaaaaanyhow, let's get on with this with a quick spec-dump.
Alliance AT3D - specs
Year released: 1997
Core: Alliance AT3D, unknown manufacturing node, 61MHz
Driver version: 4.10.01.2072
Interface: PCI
PCI device ID: 1142-643D / 0000-0000 (Rev 02)
Mem clock: 61MHz real/effective
Mem bus/type: 4MB 64-bit EDO, 488MB/s bandwidth
ROPs/TMUs/Vertex Shaders/Pixel Shaders/T&L hardware: 1/1/0/0/No
DirectX support: DirectX 6 (DX 3/5)
OpenGL support: - Native: no support - Techland OGL to D3D: --100% OpenGL 1.1 compliant --12% OpenGL 1.2 compliant --0% compliant beyond OpenGL 1.2 --Vendor string:
Vendor: Techland Renderer: Direct3D (display, Primary Display Driver) KNI Version: 1.1 beta 6
OpenGL support (continued) - AltOGL OGL to D3D: --100% OpenGL 1.1 compliant --100% OpenGL 1.2 compliant (but I highly doubt it) --Vendor string:
Vendor: Brian Paul Renderer: altD3D Version: 1.2 Mesa 3.0
As for the rest of the system...
Windows 98 SE w/KernelEX (no updates)
Matsonic MS7102C
Intel Pentium III (Coppermine) @ 750 MHz
256MB PC-133 SDRAM (single stick of Kingston KVR133X64C3/512, can't extract SPD data bc system crashes)
Hitachi 4GB Microdrive
Some random Slot 1 Cooler Master CPU cooler
And with that out of the way, onto the notes!
So. Uh, yeah. The Alliance AT3D, and more specifically this AT3D, is a very... VERY strange card. Despite releasing rather late for a 3D-capable graphics chip in comparison to the competition, the AT3D is very clearly half-baked at best, and a flaming dumpsterfire at worst. I'm not sure if it's the hardware itself or drivers written by the world's worst driver dev team to have ever existed, but there is something very, very wrong with the 3D rendering capabilities that this card has.

As implied by the specs of the card from up above, the AT3D has no native OpenGL support. Or native DirectX 6, for that matter. Windows 98 just happens to like to stamp DX6 support onto cards that don't support anything higher. This card was released targeting Windows 95, and drivers for Windows 98 and up were never made available. Luckily, with how similar the two OSes are at the kernel level, the Win95 drivers are fully-compatible with '98. Yes, that is in spite of the atrocious 3D rendering.
So, anyway. OpenGL. That was what this video was intended to focus on, but between me catching Covid back in August and only finally recovering enough to begin recording by late September, the video ultimately ended up as this congealed pile of rambling and chaos, with plenty of Windows 98 crashes sprinkled in for flavor.
ClassiCube runs... okay, I guess, if you're willing to look past the amazing rendering quality. AltOGL crashes ClassiCube 1.3.5, and though I've been told that 1.3.6 works with AltOGL, with the AT3D at least it still crashes.
So, instead of AltOGL, I am using Techland's OpenGL to Direct3D wrapper. Though intended to be a "mini-GL" for their Crime Cities game, Techland's wrappers have a decent reputation for speed and compatibility among low-end cards. Though, the AT3D is clearly an outlier on both fronts.
Minecraft itself is unable to launch with either wrapper. With the Techland wrapper, the game complains about not being able to create an OpenGL context, which isn't too surprising given how the Techland wrapper implements only a subset of the OpenGL spec. Slightly more surprising, however, is the fact that AltOGL also fails to allow the game to launch, instead resulting in an instant crash back to desktop. So, while Minecraft proper isn't able to run on the AT3D, I still would say that you could pull off some block game shenanigans with this thing if you're willing to suffer the pain of its rendering hardware screaming in agony.
Other games aside from Minecraft and ClassiCube were tested as well, or at least attempted to be tested, but much like Minecraft itself, they crashed in varying levels of severity with both wrappers. Aside from ClassiCube, the only thing I was successful in running were the 3DMark99 and 2000 benchmarks and demos.
But, this is not where this writeup ends. Oh, no. There's still a completely fresh, unopened can of worms sitting right here on the table for all of us to enjoy.
That can of worms? The card itself and its BIOS.
If you take a look at VGA Legacy MKIII's entry on the Alliance AT3D, you'll find that all of the cards shown on the website are made by a company called "Super Grace". All of them are identical.
My card, however, has a little extra something: a populated 10-pin header.

I'm not 100% sure about what its function is, but just from eyeballing where the traces lead to from the pin header suggests that this may be a header for some kind of optional TV-out add-on board. Perhaps one that outputs composite and/or s-video. It certainly fits in line with the strange video BIOS (vBIOS) that this card comes with. (Also, wow! Matching graphics core, PCB, and memory chip branding! And look at that neato peacock logo from an unknown company!)
So then. The BIOS. This card's BIOS is version 4.30.00 Build 29, whereas the version of BIOS on the Super Grace cards is 2.30.00 Build 29. The differences go beyond just a bump from a 2 to a 4, too; the "version 4" BIOS has a neat animated Alliance Semiconductor logo and banner that slides in from the right, whereas the "version 2" BIOS is a static text box with no logo or anything. However, the lack of animation does also allow the system to complete the bootup process much faster.
Below are a pair of videos demonstrating the difference:
youtube
youtube
Beyond the visual differences, the primary functional difference between the v2 and v4 BIOSes is that the v2 BIOS seems to be completely unaware of the card's seeming TV-out capability. The v4 BIOS, however, is practically hypersensitive to it.
I'll give you a rundown of what happened.
I often use a program called "VCS" to let my Datapath VisionRGB capture card pass video through and act more or less like a regular monitor. The problem, however, is that when the capture card is hooked up to the AT3D, the EDID (Extended Display Identification Data) that the capture card sends apparently identifies it (to the AT3D, at least) as... a TV.
Now, normally this shouldn't be much if any issue. Graphics cards after all are meant to be able to handle this sort of connection. Heck, before the capture card, I was using an actual TV as the monitor for my testbench.
But because weird hardware seems to gravitate towards me, this was not the case for the AT3D. Whatever EDID info the capture card seems to send to the AT3D when VCS is used causes the card to trigger its "TV mode", which enables a TV features tab in the card's graphics settings and disables almost all of the regular settings for refresh rate and screen positioning available in the regular settings tab. Attempting to disable the TV features tab results in a system crash.
The solution to this? Going into the VisionRGB configuration, manually wiping the EDID information, and using OBS as a video-out from the capture card instead after rebooting both the capture system and the testbench. This finally gets the AT3D to recognize the capture card as a regular monitor rather than a TV, and makes the TV features tab go away and unlocks the regular monitor settings.
The v2 BIOS does not do any of this.
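For the curious: the information a display hands over lives in its EDID block, whose layout is fixed by the VESA spec. I never dumped what the VisionRGB was actually sending, so the sample bytes below are invented, but the header check and manufacturer-ID decode are real EDID 1.x fields:

```python
# Decode the fixed EDID header and the manufacturer ID (bytes 8-9):
# a big-endian 16-bit word packing three 5-bit letters (1='A' ... 26='Z').
EDID_MAGIC = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def manufacturer_id(edid: bytes) -> str:
    assert edid[:8] == EDID_MAGIC, "not an EDID block"
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

# Dell's well-known PnP ID 0x10AC decodes to "DEL"; the padding is fake.
sample = EDID_MAGIC + bytes([0x10, 0xAC]) + bytes(118)
print(manufacturer_id(sample))  # → DEL
```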
I really don't have anything else to say about this card. The AT3D alone is already known for being one of if not THE worst graphics card to ever exist, and the extra sprinkle of weird behavior that my specific card shows is more or less just icing on the terribleness cake. The lack of the ability to really do anything aside from purely-2D tasks makes it hard to have much that can be done with the card in the first place. Though, I guess Baldur's Gate 1 works pretty well since it's a sprite-based isometric 3D game that only uses 2D rendering techniques.
Anyway yeah this card is peak jank lol. Have my card's weird BIOS that doesn't exist anywhere else on the internet. The v2 BIOS is available from VGA Legacy MKIII.
Also the AT3D can run Tux Racer if you slap AltOGL on it. It crashes the whole system if you try and use the Techland wrapper, so the behavior is the polar opposite of ClassiCube 1.3.5.
Update: It appears that the YouTube channel RetroTechBytes also has a nearly-identical AT3D, peacock logo and all:
Ryujinx is an Experimental Nintendo Switch Emulator written in C# for .NET Core
I love emulators. I love that they exist. I love that we have computers that are not only fast enough to translate and emulate instructions in real time for totally different computers that may not even exist any more but also for computers that are shipping today!
I love these C# based emulators:
CoreBoy is a cross platform GameBoy Emulator written in C# that even does ASCII
Emulating a PlayStation 1 (PSX) entirely with C# and .NET
Today I learned about Ryujinx, an experimental Nintendo Switch Emulator written in C# on .NET Core. The homepage is at https://ryujinx.org/. Emulators are great for learning about how to write and factor great code. Some are certainly "big ball of mud" architecture, but Ryujinx is VERY nice.
Ryujinx is particularly cleanly factored, with individual projects and modules that really follow the single responsibility principle.
It's written in .NET 5 and you can just git clone it, and go into the Ryujinx folder and "dotnet run," or build from Visual Studio. There are also daily builds on their site.
Some of the impressive features - and again, this is written in C# on cross-platform open source .NET 5:
The CPU emulator, ARMeilleure, emulates an ARMv8 CPU and currently has support for most 64-bit ARMv8 and some of the ARMv7 (and older) instructions, including partial 32-bit support. It translates the ARM code to a custom IR, performs a few optimizations, and turns that into x86 code.
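That two-tier pipeline (guest instruction → intermediate representation → host instruction) can be sketched in miniature. This is a toy illustration in Python, not ARMeilleure's actual design: the IR shape and function names are invented, and the real translator handles hundreds of opcodes plus register allocation and optimization passes.

```python
# Toy sketch of a translate-to-IR-then-lower-to-host pipeline, loosely
# in the spirit of a dynamic binary translator. Everything here is
# simplified for illustration.

from dataclasses import dataclass

@dataclass
class IrOp:
    op: str        # IR operation, e.g. "add"
    dst: str       # destination register
    srcs: tuple    # source registers

def arm_to_ir(insn: str) -> IrOp:
    """Decode a (very simplified) ARMv8 text instruction into one IR op."""
    mnemonic, rest = insn.split(None, 1)
    operands = [p.strip() for p in rest.split(",")]
    if mnemonic == "add":              # e.g. "add x0, x1, x2"
        return IrOp("add", operands[0], tuple(operands[1:]))
    raise NotImplementedError(mnemonic)

def ir_to_x86(op: IrOp) -> list:
    """Lower one IR op to x86-style two-operand instructions."""
    if op.op == "add":
        # x86 add is destructive (dst += src), unlike ARM's
        # three-operand form, so copy the first source into dst first.
        return [f"mov {op.dst}, {op.srcs[0]}",
                f"add {op.dst}, {op.srcs[1]}"]
    raise NotImplementedError(op.op)

print(ir_to_x86(arm_to_ir("add x0, x1, x2")))
# → ['mov x0, x1', 'add x0, x2']
```

The IR layer is what makes the "performs a few optimizations" step possible: passes can rewrite IR ops without caring about either the ARM source encoding or the x86 target.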
The GPU emulator emulates the Switch's Maxwell GPU using the OpenGL API (version 4.4 minimum) through a custom build of OpenTK.
Xinput-compatible controllers are supported natively; other controllers can be supported with the help of Xinput wrappers such as x360ce.
Most emulators are created for educational and experimental purposes, so don't look to be using this for nefarious purposes. This is a fantastic codebase to explore and experiment with.
Using a computer is like riding in a Lyft. Writing an Emulator is like disassembling an internal combustion engine and putting it back together differently and it still works. It won't make you a better person but it will make you appreciate your Lyft.
© 2021 Scott Hanselman. All rights reserved.