AI BREAKS NES TETRIS! - 102 MILLION and level 237

Retro

Founder
Staff member
Joined
4 Jun 2021
Messages
2,047 (3.76/day)
Superhuman doesn't even begin to describe it. :cool: The AI goes far beyond what any human could ever hope to do.

The AI programmer, Greg Cannon, has done a great job of beating Tetris on the NES - by eventually crashing the game! Check it out.




And here's a little extra from the official Tetris site:

 

Arantor

Well-known member
Staff member
Joined
24 May 2022
Messages
713 (3.77/day)
I think I'm actually more fascinated by the bug behaviours than I am by the AI itself here ;)
 

Retro

Founder
Staff member
Joined
4 Jun 2021
Messages
2,047 (3.76/day)
Yeah, they're pretty interesting. I'd like to see how a Tetris implemented on the PC, with vastly increased capacity and speed, would behave with this AI.
 

Arantor

Well-known member
Staff member
Joined
24 May 2022
Messages
713 (3.77/day)
I would expect it comes down to how literal the port is.

When porting, I've seen ports that go for original-exact fidelity, replicating every behaviour down to the frame, which is what I'd expect of a PC port suitable for battle-testing an AI on: the actual scoring mechanics, the exact circumstances for advancing levels, the point at which the speed changes, and the handling of inputs versus individual frames (because this matters).

As far as score computation is concerned, where the crash occurs, I could see an original-exact port being funny about this, because a perfect port might strive to replicate the bugs too, including running off the end of the sprite sheet to draw scores beginning with A and so on. If the port replicates that for accuracy, it'll likely replicate the scoring bug that crashes the game as well.

The part I don't get is why the scoring bug happens, though I'm willing to concede I'm not a NES dev! The behaviour in the video showed gameplay getting progressively slower - but I'm not sure I understand why. According to the Fandom wiki for Tetris, you get points for soft-drop (i.e. accelerating the piece downward as we see the AI doing before it snaps into place; it never hard-drops from what I could see), and beyond the early levels this is scaled up.

The thing is, this doesn't scale with level, so it should be a constant-time calculation; I could see it not being constant time initially if it scaled with score, but... even in the worst case, adding points to an 8 digit number should remain a constant-time activity relative to other 8 digit numbers (e.g. it's no more computationally expensive to add to 10000000 than to 89999999), yet we saw it progressively slow down the further it went.

I'd get the 4-line clear breaking it though, if the time taken to process that calculation (which is higher than others because it does scale with level) doesn't fit inside one frame, though I'm not entirely sure I understand when that wouldn't happen.
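For reference, the line-clear formula as commonly documented for NES Tetris can be sketched like this; the soft-drop value is the commonly cited one point per cell, and this is an illustration of the maths being discussed, not the actual NES code:

```python
# Commonly documented NES Tetris scoring: line clears pay a base amount
# multiplied by (level + 1); soft drops pay 1 point per cell, level-independent.
BASE_POINTS = {1: 40, 2: 100, 3: 300, 4: 1200}

def line_clear_points(lines: int, level: int) -> int:
    """Points awarded for clearing `lines` rows at `level`."""
    return BASE_POINTS[lines] * (level + 1)

def soft_drop_points(cells: int) -> int:
    """One point per cell the piece is soft-dropped, regardless of level."""
    return cells

# Only the line-clear term grows with level; a Tetris at level 236 pays
# 1200 * 237 points, while soft-drop income stays flat.
print(line_clear_points(4, 0))    # 1200
print(line_clear_points(4, 236))  # 284400
```

This is why only the line-clear calculation, and not the soft-drop income, scales with level.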

I dunno... I'm not a NES dev, but I have done assembly on comparable hardware (ZX Spectrum, and very briefly on the BBC Master which even uses the 6502 processor!) and I struggle to imagine what's going on here.

But if you put aside the crash bug in a port, I imagine the PC version would be able to carry on forever assuming no streak of bad luck so long it can't recover from it. The amount of time it seemed to be running at the enhanced speed didn't seem to be any real blocker for it, and it even recovers gracefully from a shortage of bars (which was very noticeable from the killscreen stats)
 

Retro

Founder
Staff member
Joined
4 Jun 2021
Messages
2,047 (3.76/day)
I wasn't thinking of a "bit for bit" port like you described, although that does have its place, especially for emulators. I had in mind a program written from scratch to take advantage of a modern PC's capabilities in terms of graphics, CPU processing, storage etc., with the gameplay kept substantially the same, including the playing speed.

Of course, nothing has infinite capacity, but I'd expect a correctly implemented game to keep going indefinitely, which isn't that hard to do: the score would display properly up to its maximum value and then wrap around through zero, without crashing or doing other funky stuff like those slowdowns.

Now, how big that score can get is more or less arbitrary on a modern PC, so presumably it could take hundreds or even thousands of years if the score counter has enough bits. An overflow flag could be set when it wraps over and the time it took logged for example, so one knows it's happened. Of course, the hardware would never run for that long and if it somehow did, the original programmer and their friends would never see it as they'd be long gone...
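That wrap-and-flag idea could be sketched like so; the 64-bit width, the class, and the field names are all assumptions for illustration, not taken from any real port:

```python
import time

class Score:
    """A fixed-width score counter that wraps through zero instead of
    crashing, and records when the overflow happened."""
    MAX = 2**64 - 1  # largest value a 64-bit unsigned counter can hold

    def __init__(self):
        self.value = 0
        self.overflowed = False
        self.overflow_time = None

    def add(self, points: int) -> None:
        total = self.value + points
        if total > self.MAX:
            self.overflowed = True
            self.overflow_time = time.time()  # log when the wrap occurred
        self.value = total & self.MAX  # wrap around through zero

s = Score()
s.value = Score.MAX - 100
s.add(300)
print(s.value, s.overflowed)  # 199 True
```

The masking keeps the addition constant-time no matter how large the running total gets, which is exactly the behaviour the NES version apparently lacks.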

Why the NES hardware behaves like that, I have no idea either. I'm sure if we could get hold of the technical reference manual and look at the code in detail, we could work out at least some of what's going on. I'll bet a lot of those funky effects on the NES come down to simple boundary overflows that nobody bothered to trap, because no human could ever get up there to trigger them, which is a reasonable trade-off for a commercial product like this.

@TheURL this programmy stuff will be right up your street. ;)
 

Retro

Founder
Staff member
Joined
4 Jun 2021
Messages
2,047 (3.76/day)
Thinking about the slowdown, it's possible that the NES kept some kind of data stack that was then scanned through as the pieces fell. As it got bigger, the scan would take longer and hence the gameplay slows down. Just one plausible explanation.
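That guess can be illustrated with a toy sketch (emphatically not the actual NES code): if each frame scans a record that grows as the game runs, the per-frame cost grows with it, which would look exactly like progressive slowdown.

```python
# Toy model of the speculation above: per-frame work that scales with a
# growing data structure, so each frame is more expensive than the last.
def frame_work(history: list) -> int:
    # Pretend the game walks the entire record every frame.
    return sum(1 for _ in history)

history = []
costs = []
for piece in range(5):
    history.append(piece)            # the record grows by one entry per piece
    costs.append(frame_work(history))

print(costs)  # [1, 2, 3, 4, 5] -- linear growth in per-frame cost
```

A constant-time lookup instead of a full scan would keep every frame the same cost, no matter how long the game ran.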
 

Arantor

Well-known member
Staff member
Joined
24 May 2022
Messages
713 (3.77/day)
From a graphical perspective, the AI wouldn't care at all - it's looking to match what it knows the screen looks like (e.g. the next piece shown) with what it knows about the game, so whether it's technicolour or NES colour, it doesn't matter to an AI.

But keeping the gameplay identical down to the frame by frame behaviours should be relatively possible without too much drama.

I suppose the biggest question is at what point you start running into counting limits: a 4 byte (32 bit) number holds a score of about 4.3 billion without loss of accuracy, an 8 byte (64 bit) number gets you to 18,446,744,073,709,551,615 without loss of precision, and a 16 byte (128 bit) number gets you to something north of 3*10^38 (imagine a 3 with 38 zeroes after it). I think you're more at risk of the game taking an exponential amount of time to reach those sorts of scores, but it's fascinating to think about.
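Those limits can be checked directly, since an n-bit unsigned counter maxes out at 2^n - 1 before it has to wrap:

```python
# The counting limits quoted above, computed directly.
for bits in (32, 64, 128):
    print(f"{bits}-bit max score: {2**bits - 1:,}")
```
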

The question really becomes at what point you're prepared to change the rules of the game to see what the limits really are. The speed of the game at this level is bounded primarily by rendering and input speed: the AI doesn't get access to the internals of the game to see what the next piece is, it has to wait for the first frame showing it. Similarly, it's bounded by the input resolution of - what was it, 12 Hz? That's the maximum speed input is *allowed* into the game, but there's no reason you have to keep these metrics bound to actual time.

If you decouple it from *actual* time and simply treat the game as a series of frames that can be reviewed or played back at any speed, your ability to crunch frames becomes significantly faster, because you no longer have to wait for real time to elapse between frames.

You could model 'here are the 30 frames in second 001, with the following inputs being applied on these frames' and play them back in real time if you wanted, at which point you could let it run as fast as it can be computed; the 30 frames for second 001 might not take a second to be calculated especially with modern computing hardware.
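A minimal sketch of that frame-log idea, with made-up game logic standing in for real Tetris rules: if the game is a pure function of (state, input) per frame, a recorded input log can be replayed deterministically at any speed.

```python
# Hypothetical sketch: decouple the simulation from wall-clock time by
# modelling a second as 30 frames with inputs attached to frame numbers.
FPS = 30

def step(state: int, button: str) -> int:
    # Stand-in for one frame of game logic; here it just counts presses.
    return state + (1 if button == "press" else 0)

def replay(inputs: dict) -> int:
    """inputs maps frame number -> button; frames without input are idle.
    Runs the 30 frames of 'second 001' as fast as they can be computed."""
    state = 0
    for frame in range(FPS):
        state = step(state, inputs.get(frame, "idle"))
    return state

# 'Here are the 30 frames in second 001, with inputs on these frames':
log = {3: "press", 10: "press", 27: "press"}
print(replay(log))  # 3
```

Because the loop never sleeps, computing those 30 frames takes however long the CPU needs, not a real second, and the same log always reproduces the same final state.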

I'm reminded of some of the stuff in the DOOM speedrun arena where the runs are recorded in such a way that the states and keypresses per frame are recorded so they can be played back perfectly and watched for signs of tampering - there's no reason something similar couldn't be done for Tetris, but it looks cooler to do it with a simulated NES ;)
 