- cross-posted to:
- games@lemmy.world
You can play it in your browser here.
On another forum, I was complaining about how Microsoft was planning to remove WordPad from Win11. I was advised that installing OpenOffice or LibreOffice was an appropriate replacement. I replied that WordPad was only 3 megs large, as opposed to the recommended replacements, which are decidedly larger.
I guess not everybody appreciates tight code, but I surely do. Things like this are amazingly impressive.
Appreciates tight code
Runs Windows 11
Pick a lane, son.
Appreciates tight code
Proceeds to run a 13 KB JavaScript game in a browser.
Brother, look up KolibriOS, then you’ll see some TIGHT code
AmigaOS 1.3 or bust.
I will die on this hill.
Anyway, don’t install OpenOffice for any reason; just pick LibreOffice or OnlyOffice. OpenOffice hasn’t gotten a functional/security/compatibility update since 2014.
I just looked at how big LibreOffice Writer is, 210 MB as a portable app… Wow…
AbiWord Portable is probably the smallest and even that is 15 MB installed…
I was gonna say notepad but I just looked and it’s 18 MB. Granted, I have a few plugins installed though.
I don’t particularly care about code size as a user or as a programmer.
Hard drive space is the cheapest thing you’ve got on a computer.
You could always run Gentoo and use -Os… that can make things a lot smaller but also slower.
Hard drive space is the cheapest thing you’ve got on a computer.
I hate this “storage is cheap” mentality, it’s a cop out for being wasteful without a reason. “Gas is cheap” was common up to the early 1970s, until it wasn’t anymore. “Freshwater is cheap”, until it isn’t anymore.
It’s an invented problem. A program takes what a program takes. Everyone cares way more about the code being legible, the code being fast enough, and the code not using a ton of memory (and even that last one is kind of shrugged off depending on context).
Applications that take 3 MB take 3 MB because they do next to nothing, or because they do it with a bunch of shared libraries… which is a whole other dependency-management mess just to save a few MB on a drive.
There’s also a huge difference between being wasteful of something that pollutes the planet en masse and is not renewable, like gasoline (which is the only reason you’d be upset about that now), and wasting a few MB on a drive.
The equivalent of your complaint, 3 MB vs 200 MB, is like complaining about a person taking a trip to the grocery store… It’s insignificant and often necessary.
You can say that program does way more than you need, but … nobody is catering to “only what you specifically need” and using the larger program almost certainly covers your needs.
Furthermore, like I already said, making things smaller often makes them slower… Since the CPU is more expensive to improve, of course things are bigger; that’s what more people care about. Some video games take that to an extreme with uncompressed files and 250 GB install footprints… but 200 MB?
Everyone cares way more about the code being legible, the code being fast enough, and the code not using a ton of memory (and even that last one is kind of shrugged off depending on context).
And then you look at real life and notice that code everywhere is slow, bloated and inefficient. But hey, it’s “legible”! To one or two devs, hopefully.
The equivalent of your complaint 3mb vs 200mb is like complaining about a person taking a trip to the grocery store
Terrible analogy. A better equivalent is someone renting a garage to store stuff inside and now, because they have so much space, there’s that urge to fill it, whether it makes sense to or not.
making things smaller often makes them slower
It’s usually the other way around. As a rule of thumb, less code = smaller size = faster execution. In theory, 1k lines of code will require less computation, less processing, than 10k.
And then you look at real life and notice that code everywhere is slow, bloated and inefficient.
That’s not true in practice. I mean, that code does exist. However, the vast majority of code is reasonably performant.
Not everyone is an expert at optimization and that’s fine … we’d have a lot less software in general if only the best of the best were allowed to author it.
It would be great if more things went back to native (or at least away from “I need an entire web browser for my app to function”); that, to me, is wasteful… But a few hundred MB for a program as large, complicated, and feature-rich as LibreOffice is not.
Terrible analogy. A better equivalent is someone renting a garage to store stuff inside and now, because they have so much space, there’s that urge to fill it, whether it makes sense to or not.
No, that’s … just wrong. It’s not like people are just writing code and leaving it there to do nothing except increase code size or are actively trying to fill the drive.
It’s usually the other way around. As a rule of thumb, less code = smaller size = faster execution. In theory, 1k lines of code will require less computation, less processing, than 10k.
That’s not inherently true, though it is a common misconception/oversimplification. When you do things like code inlining, you increase code size (because you’re taking that function’s code and having your compiler copy it around to a bunch of places), but the increased locality speeds things up. There’s a reason -Os and -O3 are not the same option.
Now sure, executing fewer instructions is better than executing more localized code (though even that can be wrong given processor cache and relative instruction speed). Lots of programs have added features that you might not use, but that doesn’t really “hurt you”; it’s not the source of your program’s or your computer’s slowness, it’s just some bytes on the drive.
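A toy sketch of that -Os vs -O3 trade-off, assuming GCC or Clang (a hypothetical example; exact sizes and timings vary by compiler, flags, and target):

```c
/* toy.c — toy illustration of the size/speed trade-off (assumes GCC/Clang).
 * With -O3 the compiler will typically inline scale() into the loop and
 * unroll it (larger binary, usually faster); with -Os it favors compact
 * code instead. */
#include <stdio.h>

static int scale(int x) { return x * 3 + 1; }

int main(int argc, char **argv) {
    (void)argv;
    long long sum = 0;
    for (int i = 0; i < 10000000; i++)
        sum += scale(i + argc);   /* depends on argc so the loop can't be folded away at compile time */
    printf("%lld\n", sum);
    return 0;
}

/* Compare:  cc -Os -o toy_small toy.c   vs.   cc -O3 -o toy_fast toy.c
 * then check the binary sizes with `size` or `ls -l`. */
```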
We’re a long way from the Unix style “everything is a small program that gets piped into other programs to do interesting things” days. That paradigm just doesn’t work for GUI software. Nobody does that because … normal folks would rather have one office program than have to go shop for 275 programs so that they can have separate programs to edit the document, print the document, convert the document to pdf, update calculations in their spreadsheet, run macros, etc (which if you use all/most of them would likely be more expensive in terms of disk space anyways).
Reminds me of kkrieger, an FPS with impressive graphics and sound for the time, that weighed in at 96 KB.
A long time ago I came across a game that was part of a 1 MB challenge. It’s called A New Zero. I played it quite a lot; just flying around and dive-bombing boats was entertaining enough for me.
I was impressed with 1 MB, but 13 kB and 96 kB are pretty amazing. I really enjoy seeing stuff like this.
kkrieger is more impressive to me, because it creates itself at execution time. This 13 kB game is willfully ignoring the fact that the average web browser today is already a 2 GB behemoth, while kkrieger is pure C++ and does the whole thing itself, including the game engine and sound processor and everything else.
It’s definitely not pure C++. It makes use of DirectX and even fonts available in Windows to create textures.
Without DirectX or OpenGL you’d have to create a GPU driver or do CPU rendering…
kkrieger was always kind of amazing to see. Even understanding a little bit about how the game works, it’s still kind of mind-boggling.
This is crazy!
I have so many questions, but lack the technical know how of how to ask them.
Instead of actually storing images, sound files, maps, etc., the whole program relies on algorithms computed at runtime. The level is generated automatically, the sound follows a set math pattern with randomization, etc.
The benefit is a smaller file size, but with higher processor requirements.
Basically it’s all procedurally generated assets. It takes a long time to start because it has to make all its textures and sound effects.
Check this video out by Nostalgia Nerd
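A rough sketch of that idea (hypothetical, not kkrieger’s actual code): instead of shipping an image file, you ship a few lines of math that compute the pixels at load time.

```c
/* Hypothetical example of a procedurally generated texture: rather than
 * storing a 256x256 RGB image (~196 kB uncompressed), compute it from
 * formulas at startup and only ship the code below. kkrieger applies the
 * same idea, far more elaborately, to its textures, meshes, and sounds. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define W 256
#define H 256

static uint8_t *make_texture(void) {
    uint8_t *tex = malloc((size_t)W * H * 3);
    if (!tex) return NULL;
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            uint8_t *p = tex + ((size_t)y * W + x) * 3;
            p[0] = (uint8_t)(x ^ y);                 /* XOR pattern              */
            p[1] = (uint8_t)((x * y) >> 6);          /* multiplicative interference */
            p[2] = ((x & 32) ^ (y & 32)) ? 255 : 32; /* checkerboard             */
        }
    }
    return tex;
}

int main(void) {
    uint8_t *tex = make_texture();
    if (!tex) return 1;
    FILE *f = fopen("tex.ppm", "wb");   /* dump as a PPM to inspect the result */
    if (!f) { free(tex); return 1; }
    fprintf(f, "P6\n%d %d\n255\n", W, H);
    fwrite(tex, 3, (size_t)W * H, f);
    fclose(f);
    free(tex);
    return 0;
}
```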
How much of that is third-party libraries, and/or third-party hosted? Obviously the assets (images and music) aren’t being counted.
The whole page transferred about 7 kB and shows 18.2 kB of resources according to the debug tools.
The game also requires a renderer (browser) to play.
I think what they did is impressive but the claim about the size feels like taking source code and saying “look how small on disk it is”
It is JS; it is always source code.
And I was blown away that the PS1 game Vagrant Story turned into a 90 MB file as a CHD.
That said, one of the games I enjoyed most, even when I had an Amiga, was a 48K ZX Spectrum game called Chaos.
I did not know that I needed loderunner-quake in my life.
Thanks for posting.
Online multiplayer?
OK, that is an impressive number, but it feels a little disingenuous. You still need something on your machine to interpret the JS code, right? Is that included in the 13k? How much storage does that take?
EDIT: Well this is by far my most negative comment here. That’s almost entertaining. I’ll share a few more of my thoughts here rather than respond to individual comments. Maybe the context will make this more palatable.
First, I expect that the js language is doing most of the work here. Which makes sense. But having a browser installed as a prerequisite is an enormous dependency.
How would that stack up against other languages? Can I build a 13k binary using C? How about C#? I think Go is maybe the most interesting because the binary is entirely self-contained by default; no external dependencies aside from the OS. I don’t think this or a similar game is viable with only 13k. Which is fine! It’s just that I find the 13k claim disingenuous.
That brings up the question of whether or not we should include the OS in the storage size. I would think not. But that’s only because the OS is (usually) the least common denominator when we talk about developing software. It’s generally assumed by default. But if someone wants to compare with a game that interfaces with hardware directly, then yes, we should absolutely include the OS as a dependency.
Now that I’m giving this more thought, I suspect that the devs wrote 13k of code + assets to make the game functional. Still impressive. But the more I think about this, the more meaningless that number gets. Does the size before or after compiling matter more? What if we compress the thing as a tarball? There are just too many ways to manipulate this number.
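For a rough sense of the baseline being debated (a hypothetical sketch, assuming GCC or Clang on Linux): even a trivial C program built for size still leans on the system’s C library and the OS, much like the 13k game leans on the browser’s JS engine.

```c
/* hello.c — hypothetical baseline for the "13k binary in C" question.
 * Build small with:  cc -Os -s -o hello hello.c
 * (-Os optimizes for size, -s strips symbols.) The result is typically in
 * the low tens of kB on Linux x86-64, and it still depends on the system
 * C library and the OS. Exact size varies by toolchain and platform. */
#include <stdio.h>

int main(void) {
    puts("hello");
    return 0;
}
```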
Should the machine’s operating system be counted toward the storage too?
Is there a competition for smallest bootable 3D FPS?
Your BIOS screen could be, if you’re brave enough.
Potentially. See my edit above.
That’s irrelevant to the contest.
Depending on how “pure” you want to get, you’d have to look into games that play from boot, so not unlike stuff you’d get from the SNES and older consoles.
Everybody has a browser that runs JS. Only 13k has to be transmitted via floppy disk
deleted by creator
You’re one of those people who think that Windows XP shouldn’t require Internet Explorer to run.
I am not
Removed by mod