Games IT

Rockstar Pays $10,000 To Modder Who Fixed GTA Online Loading Times (gamesindustry.biz) 72

Rockstar Games has paid a modder $10,000 for identifying a way to make Grand Theft Auto Online load significantly faster. From a report: The modder, who goes by the handle 't0st,' recently posted their discovery of a single-thread CPU bottleneck that occurs in the PC version of the hit multiplayer mode. They created a fix they claim enables the game to load 70% faster, and included a message for Rockstar advising that the issue "shouldn't take more than a day for a single dev to solve." Reports of t0st's discovery spread, and Rockstar has confirmed not only that the fix works, but that it will release an official fix in a future update for the game. In a statement to PC Gamer, the company said: "After a thorough investigation, we can confirm that player t0st did, in fact, reveal an aspect of the game code related to load times for the PC version of GTA Online that could be improved. As a result of these investigations, we have made some changes that will be implemented in a forthcoming title update."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward
    I don't have to RTFA do I?
    • The article links to his blog post [nee.lv], which is definitely worth reading. It goes over the specifics of what he uncovered and how he fixed it.

  • by Pierre Pants ( 6554598 ) on Tuesday March 16, 2021 @02:38PM (#61165686)
    Diverting the bad publicity of such a huge problem being solved by a modder instead of the company into "Modder gets a fat reward" is the best possible way to handle this.
  • This is interesting (Score:5, Interesting)

    by Dutch Gun ( 899105 ) on Tuesday March 16, 2021 @02:45PM (#61165710)

    I actually dug into this modder's report, and what was most impressive to me was the reverse-engineering he did to isolate and identify the problem, obviously without access to any source.

    If I remember correctly, the problem itself was pretty mundane... Rockstar was apparently loading a crap-ton of JSON files (God, why?), which is going to perform horribly in C++ unless you do something a bit more optimal with string allocations (game engines almost always preprocess stuff like that for just that reason). Or something close to that I think. It sort of demonstrated to me that Rockstar never even bothered to fire up a profiler to see what was taking so long in their loading code. Frankly, it should have been blazingly obvious, given they have the source. So it's a bit of egg on their face.

    Good on them for giving this guy props and some money. That's definitely the way to handle this sort of thing. And hopefully they've learned a bit of a lesson about picking their own low-hanging fruit. And as a gamedev myself, it's a good reminder never to take such things for granted.

    • Rockstar never even bothered to fire up a profiler to see what was taking so long in their loading code. Frankly, it should have been blazingly obvious, given they have the source. So it's a bit of egg on their face.

      Nah, to them it's not "taking so long", it's "providing an opportunity to advertise sales and Shark Cards to our players". Now, if they can get around to minimizing the time I get to play the Los Santos Cloud Watcher minigame in between every job, that'd be great.

    • by raymorris ( 2726007 ) on Tuesday March 16, 2021 @03:01PM (#61165756) Journal

      > It sort of demonstrated to me that Rockstar never even bothered to fire up a profiler to see what was taking so long in their loading code.

      At one job, in my first week there, I overheard that customers were complaining the software was too slow.
      Around the beginning of the second week, I piped up in a team meeting and said I could probably make it 20%-30% faster, since I didn't have anything else to work on yet. They chuckled and told me several of their senior devs had already done all they could to make it faster. They thought I was full of shit when I said "give me a week and let's see what I can do. I haven't looked at the code yet, but I bet I can cut the time by 20%-40%". I got the go-ahead, and the subtext was "go ahead, smarty-pants, we're looking forward to watching you faceplant on this."

      I could say that with reasonable confidence because I know hardly anyone uses a darn profiler. In pretty much all software I've ever looked at, at least 30% of the time is spent doing something that can easily be done much faster.

      I did tell my wife I might have made a mistake: my mouth wrote a check that my ass might not be able to cash.

      Sure enough, a week later it was 30% faster. Because I used a f*ing profiler to see exactly where it was spending all of its time, then put some thought into how to do the slowest part much faster. I had established technical credibility.

      A month or two later, the databases were getting overwhelmed.
      The database admins didn't have any good solutions.
      So I spent a day taking 25% of the load off the SQL servers and making them run faster, just by running a damn profiler on the slow queries.

      A few weeks in, the (complicated) software ran twice as fast.
      I had established myself as a rockstar in their minds. Just because I know how to use a profiler, which isn't difficult.

      For some reason, darn few devs and admins ever use a profiler.

      • Just because I know how to use a profiler, which isn't difficult.

        Yes, isn't it crazy how hardly anyone ever uses a profiler? And like you say, it's so easy to do; you can do just one run and often find something interesting that you didn't realize was a bottleneck or overly slow.

        Now, without access to source, that is a much more impressive effort, and I thought that guy should have got more like $100k than $10k... and Rockstar should have hired him to do performance tuning as a full-time job.

        • Actually, there are profilers that work on executables without any source code. Valgrind, for example. You run the program and take samples of the stack while it's running. You don't get every single tiny call, but that's usually not a big problem, because you're actually looking for stuff that comes up in the stack a lot.

          Once you have the stack addresses of where the slowdown occurs, you just run a disassembler and start poking through the code just like you would with source code. Takes more time, but

        • by Falos ( 2905315 )

          >>Rockstar should have hired him to do only performance tuning as a full-time job.

          Nah, need that $100k to come up with 20 in-game shirt items every year.

          Never mind that the players never give a fuck about them, or that 20 top-shelf well-made skins would take a mod community one week and $500 in shitty prizes.

          Cutting down on the fuckload of bloat we're gonna ignore? Why? We can ignore it. It runs fine on our "mid range" testing machine.

        • Not only that, they will tell you that you're crazy for using a profiler.

      • by ranton ( 36917 ) on Tuesday March 16, 2021 @03:30PM (#61165846)

        They thought I was full of shit when I said "give me a week and let's see what I can do. I haven't looked at the code yet, but I bet I can cut the time by 20%-40%".

        The one week estimate represents most of the risk here. Almost any code can be sped up, if for no other reason than that it is not a good idea to prematurely optimize code. Once speed degradation happens, you should have a good clue of where to go and what to fix. But some code bases can be a big ball of mud, or, even worse, reliant on some third-party app or similar black box. In those cases you could have a significant problem meeting such an aggressive timeline.

        • They thought I was full of shit when I said "give me a week and let's see what I can do. I haven't looked at the code yet, but I bet I can cut the time by 20%-40%".

          The one week estimate represents most of the risk here. Almost any code can be sped up, if for no other reason than that it is not a good idea to prematurely optimize code. Once speed degradation happens ...

          How do you detect speed degradations without profiling? Wait until something is so bad it causes IO timeouts or users complain? User performance complaints work like this: User - It's slow. Developer - Yes.

          I've always understood premature optimization to mean optimizing before you've looked at any profiling data. That's no reason to delay profiling. It's like delaying testing because it's too soon to start fixing bugs. You do performance profiling like you test; you're not supposed to wait until some

          • by Anonymous Coward

            The reason for avoiding premature optimization is that, yes, that part may not be optimal, but it may be (and may always be) good enough. So if you spend real, measurable development time ensuring every piece of code is as optimal as possible, then most of the time you won't end up with a reputation for always producing the fastest code; you'll just be the slowest developer.

            So you go halfway - avoid obviously bad algorithms (at least when they're going to matter; O(n^2) doesn't matter if n is guaranteed to be no mo

      • by SirSlud ( 67381 ) on Tuesday March 16, 2021 @03:31PM (#61165858) Homepage

        We use them all the time on AAA studio games. CPU profilers. GPU profilers. Multiple profilers (1st party and 3rd party) per platform. But see my post above for why we tend to profile extensively on the consoles and not as much on PC. Frankly, we probably miss stuff on PC specifically because we profile so much on the fixed hardware platforms.

        • My question is why the PC version (with its higher theoretical performance) was having performance issues while the "slow" consoles were performing better. It seems like whatever code was parsing the JSON on the consoles could have been used in the PC version.

          This suggests that they are not sharing as much code between the platforms as they should, causing more bugs than there should be.

          The problem was in a standard C library, in the way sscanf was implemented. It's quite possible that the version they used for consoles didn't have the same problem. There's no inherent need for sscanf to call strlen; it's just a side effect of some code re-use: the string is wrapped so it can be treated as a file and the fscanf machinery reused, and the routine that does the wrapping calls strlen.
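
          To make that concrete, here's a minimal sketch of the pattern (buffer contents and names hypothetical). On a libc where sscanf wraps the string like this, the hidden strlen makes the token loop quadratic:

            #include <cstdio>

            // Quadratic pattern: on libcs where sscanf wraps the string
            // in a FILE-like stream, every call strlen()s the whole
            // remaining buffer before parsing a single token.
            void count_tokens(const char* buf) {
                char token[64];
                int consumed = 0;
                // %n records how many characters this call consumed so
                // we can skip past them; the hidden strlen still rescans
                // everything that's left.
                while (std::sscanf(buf, "%63s%n", token, &consumed) == 1)
                    buf += consumed;
            }
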
        • We use them all the time on AAA studio games

          Except GTA obviously...

      • Hahaha, there's no way I would have predicted how much time I'd cut out up front. Definitely ballsy. But you're totally right... Most devs don't use profilers. And for some insane reason, they'll simply make assumptions about where they *think* the hotspots are instead of measuring. If you're using Visual Studio these days, there's no excuse, as there's an excellent sampling profiler built right in.

        But honestly (and maybe I'm giving game developers too much credit here), game developers are not really

        • Plenty of games with high performance engines are running with clunky shells. By that I mean the code that runs the graphics may be fast, but the code that runs the UI is probably optimized for maintainability rather than performance. If something slow creeps in here, it might not get the attention something would if it affected game time.

          • I suppose that may be true for some games, but the ones I've worked on, everything was monitored pretty closely. I mean, no one wants the game's runtime performance and the hard work getting it there clobbered by some sloppy UI code.

            I guess it's hard to generalize with that sort of thing... I can really only extrapolate based on my own experience. And it's not like any one game developer really works on very many games in their entire career, versus the number that are released each and every year.

      • by edwdig ( 47888 )

        I suspect the issue here isn't that they didn't run a profiler, but that they didn't run it on the right data. The live version of the game had tens of thousands of items in the JSON data. The programmers were almost certainly using smaller data files to make their daily work go faster. They'd rarely need every item in the game for their normal work, so I'd expect they normally just used a subset of it to speed things up.

        The issue here was two O(n^2) operations in the parsing, neither of which was obvious.
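
        If I'm remembering t0st's write-up right, the second one was a linear duplicate check over the item array before every insert. A rough sketch of that pattern and the obvious fix (types hypothetical):

          #include <cstdint>
          #include <unordered_set>
          #include <vector>

          struct Entry { uint64_t hash; };  // hypothetical item record

          // Quadratic version: every insert scans all previous entries,
          // O(n) per item and O(n^2) across a 63k-item catalog.
          void insert_slow(std::vector<Entry>& items, Entry e) {
              for (const Entry& it : items)
                  if (it.hash == e.hash) return;  // duplicate, skip it
              items.push_back(e);
          }

          // Linear version: an auxiliary hash set makes the duplicate
          // check O(1) amortized per insert.
          void insert_fast(std::vector<Entry>& items,
                           std::unordered_set<uint64_t>& seen, Entry e) {
              if (seen.insert(e.hash).second)  // true if newly added
                  items.push_back(e);
          }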

      • by gmack ( 197796 )

        Lucky you. I once worked at a company that was contracted to build a multiplayer bingo server. It had a 400-user limit per server/db cluster, and when I pointed out that they could make that perform much better, I was told to mind my own business since that wasn't my project. I was eventually forced out.

      • I had established myself as a rockstar in their minds. Just because I know how to use a profiler, which isn't difficult.

        For some reason, darn few devs and admins ever use a profiler.

        I pretty much did the same thing over the course of my career using Google. I still don't understand why so many people can't work it out :)

      • This is the first time I've heard of a profiler. Now I'm going to look up what it is and see what I can do with it.
      • On IBM mainframes there were GTF and TEST. There were heaps of compiler options that nobody ever learnt. There were hooks and exits on getmains and freemains, so memory leaks were never an issue. Then along came a product called STROBE, which took regular timed snapshots of the PSW so you knew where most of the time was being spent; it was later extended to DB2 to catch table scans. But you can see why MS would NOT like a product like that. The product was expensive. Amazingly, the developers rarely used it, and talented engin
      • Not surprised that you could make those improvements. No surprise either at the low usage of profilers. The reason for both is that, through my career in IT, I've found most people to be either mediocre or worse. Few people even know what a profiler is; few people even understand what the snazzy frameworks they use to build their apps do in the background. They just copy some examples from the Internet or copy and paste code from other parts of the application.
        It's true that you don't have to be an expert for
    • by SirSlud ( 67381 )

      AAA studios shipping across consoles with a PC SKU tend not to do a lot of profiling on PC, because the hardware isn't fixed and thus the return on investment isn't as high as it is for identifying bottlenecks on consoles, which will hit everyone uniformly. It's kind of easy just to make the broad but generally correct assumption: "If it performs well enough on consoles, consumer PCs are generally min-spec'd around the same level or frequently more powerful than consoles."

      But sometimes maybe some threa

    • It wasn't string allocations or a C++ problem, it was a C problem. They were using sscanf to parse one token at a time, and on each invocation it called strlen, so the parsing algorithm was quadratic. If they'd used C++ streams and >>, the problem would not have occurred.
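
      A minimal sketch of what that looks like (names hypothetical): an istringstream keeps its own read position, so extracting tokens with >> never rescans the already-consumed part of the buffer.

        #include <sstream>
        #include <string>

        // One pass over the buffer instead of a strlen per token.
        std::size_t count_tokens(const std::string& buf) {
            std::istringstream in(buf);
            std::string token;
            std::size_t n = 0;
            while (in >> token) ++n;
            return n;
        }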

      • Ah, thank you for the correction. I was thinking maybe I had the details wrong, and certainly did.

        For the record, though, you do have to be careful in C++ if you're parsing and creating a huge number of std::strings that are too large for the small-string optimization. I've gained an order-of-magnitude speed improvement by writing specialized allocators to avoid that issue on one particular tool that had to process a LOT of text. So I was probably partially remembering my own experience with C++ text parsing.
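
        These days you can get much of that effect without hand-rolling a full allocator by using std::pmr (a sketch of the general idea, not what my tool actually did):

          #include <memory_resource>
          #include <string>
          #include <string_view>
          #include <vector>

          // Back every parsed string with one monotonic arena so tokens
          // too big for the small-string optimization don't each take a
          // trip to the global heap.
          void parse_tokens(std::string_view text) {
              std::pmr::monotonic_buffer_resource arena(1 << 20);  // 1 MiB blocks
              std::pmr::vector<std::pmr::string> tokens{&arena};
              // ... tokenize `text`; each element allocates from the arena
              // and everything is released at once when `arena` goes away.
              tokens.emplace_back(text.data(), text.size());  // example insert
          }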

        • Oh yeah no doubt. C++ is peculiarly awful at string processing in its standard setup and it's easy to lose a lot of performance. On the other hand it sounds like they weren't processing a lot of text, just enough to make a quadratic cost parser go very very slowly.

      • by Entrope ( 68843 )

        If they had used simdjson [slashdot.org] (Apache 2.0 licensed), they would parse that JSON so fast they'd think it finished parsing before it started.

        • That too.

          There are quite a lot of JSON libraries out there that are already written, debugged, and easy to use. simdjson is certainly the fastest, but I strongly suspect any of them would do, given that I doubt any of them are quadratic cost. The real WTF is that they hand-rolled their own.
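
          For reference, simdjson's On Demand API is only a few lines; a hedged sketch (file name hypothetical, field names taken from the catalog snippet quoted further down the thread):

            #include "simdjson.h"
            #include <cstdint>
            #include <string_view>

            int main() {
                simdjson::ondemand::parser parser;
                simdjson::padded_string json = simdjson::padded_string::load("catalog.json");
                simdjson::ondemand::document doc = parser.iterate(json);
                for (auto item : doc.get_array()) {
                    std::string_view key = item["key"];
                    int64_t price = item["price"];
                    (void)key; (void)price;  // real code would build its tables here
                }
            }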

      • > If they'd have used c++ streams

        C++'s iostreams are shit for performance.

        Smart game devs are using scnlib [github.com] or fmt [github.com] as replacements for the garbage C++ iostream and C scanf implementations.
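
        For anyone who hasn't tried it, the fmt side is about as terse as it gets (numbers made up):

          #include <fmt/core.h>

          // Type-checked formatting that benchmarks well against both
          // printf and iostreams.
          int main() {
              int items = 63000;
              double secs = 1.9;
              fmt::print("parsed {} catalog entries in {:.1f}s\n", items, secs);
          }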

        • C++'s iostreams are shit for performance.

          Who cares? It would likely have been fast enough that it wasn't a bottleneck, given that fixing the quadratic cost scanf version basically made the cost disappear.

            Your attitude is how we end up with this exact scenario. A programmer who actually cared had to speed up load time by 70%, without source, because of morons like you who don't care about performance.

            Just because you are ignorant and apathetic about performance doesn't imply other people don't care.

            Stop fucking wasting the user's time.

              Unlike you, I know how to run a profiler, so I don't waste time. I also know how to unpick the sort of colossal mess you make in the name of speed. And it's a job which pays well.

              • > Unlike you I know how to run a profiler

                How do you think people found out that iostream was shit in the first place, dumbass?

                > how to unpick the sort of colossal mess you make

                I write clean code. If code is a mess then someone is a noob.

                Data-Orientated Design is far simpler to read AND write than the garbage OOP Crap++ that morons like you write, with your over-engineered solutions of deep hierarchies, excessive virtual functions, extreme coupling, Design Pattern shit, all because you are too fucking

                • How do you think people found out that iostream was shit in the first place, dumbass?

                  You see this makes me believe yet further that you are a bad programmer, because you think in such simplistic terms. It's a question of "good enough for the task". You see I know this because unlike you I know how to profile my code. Usually iostreams is not a measurable bottleneck, so it's not worth the time and maintenance burden to replace. Sometimes it is and it's worth replacing.

                  Yeah vast, vast amounts of "clean" code

        • Wouldn't matter too much in this specific case, though. However shit they are, they're going to be better than an O(n^2) algorithm.
          • Don't you know the STL is slow? How bad of a programmer are you??? EVERYONE KNOWS IT!

            You have to use my 10,000 lines of carefully crafted "clean" code which maybe saves 1ms on game load (not that I measured). It's how I stay employed you know.

            Seriously though I have to deal with people like him regularly and it is so tedious. It's like they want me to prove elementary software engineering to them every single time.

            • Not sure if stupid or if trolling. /s

              EASTL [github.com] exists because at one time STL had many problems [open-std.org].

              Depending on the problem, the Standard C++ Lib ranges [ctfassets.net] from quite good to complete shit. For example, absl [abseil.io] and parallel-hashmap [github.com] blow Crap++ hash maps out of the water.

              > 10,000 lines of carefully crafted "clean" code

              Only a complete retard uses LoC without context as a metric.

              • It's funny. You are so close to enlightenment yet so far from understanding.

                [Snip examples and links]

                Yes exactly, but you know how those people made their case? They performed measurements on their own use case and found results where the

                In fact one ends with the line "Write benchmarks, try different things, find your own best".

                You are not doing that, you're just cargo culting. You saw a place where the STL wasn't very good and now you're just spewing "teh stl si teh sux0rzomgwtfbbq!111" all over the thread

    • > Rockstar was apparently loading a crap-ton of JSON files (God, why?)

      It was a single 10 MB JSON file with 63k unique hash entries of the in-game store (not MTX) that was being parsed and hashed.

      From the actual blog [nee.lv], the details are:

      A whopping 10 megabytes worth of JSON with some 63k item entries.

      {
          "key": "WP_WCT_TINT_21_t2_v9_n2",
          "price": 45000,
          "statName": "CHAR_KIT_FM_PURCHASE20",
          "storageType": "BITFIELD",
          "bitShift": 7,

  • Unfortunately for the modder, the money was in the form of a Shark Credit Card.

"The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts." -- Bertrand Russell

Working...