Java Program Uses Neural Networks To Monitor Games 100
tr0p writes "Java developers have used the open source Neuroph neural network framework to monitor video game players while they play and then provide helpful situational awareness, such as audio queues when a power-up is ready or on-the-fly macros for combo attacks. The developers have published an article describing many of the technical details of their implementation. 'There are two different types of neural networks used by DotA AutoScript. The first type is a simple binary image classifier. It uses Neuroph's "Multi-Layer Perceptron" class to model a neural network with an input neurons layer, one hidden neurons layer, and an output neurons layer. Exposing an image to the input layer neurons causes the output layer neurons to produce the probability of a match for each of the images it has been trained to identify; one trained image per output neuron.'"
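The article's forward-pass idea can be sketched without the Neuroph dependency. This is not the developers' code; the layer sizes, weights, and class name below are made-up illustrations of the architecture the summary describes (input layer, one hidden layer, one output neuron per trained image).

```java
// Minimal sketch of the described MLP classifier: a feed-forward pass
// through one hidden layer. All weights/sizes here are hypothetical.
public class MlpSketch {
    // Sigmoid squashes a neuron's weighted input into a (0, 1) "probability".
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // One layer: out[j] = sigmoid(bias[j] + sum_i weights[j][i] * input[i])
    static double[] layer(double[] input, double[][] weights, double[] bias) {
        double[] out = new double[weights.length];
        for (int j = 0; j < weights.length; j++) {
            double sum = bias[j];
            for (int i = 0; i < input.length; i++) sum += weights[j][i] * input[i];
            out[j] = sigmoid(sum);
        }
        return out;
    }

    // Exposing an image (flattened pixels) to the input layer yields one
    // match probability per trained image, one per output neuron.
    static double[] classify(double[] pixels, double[][] w1, double[] b1,
                             double[][] w2, double[] b2) {
        return layer(layer(pixels, w1, b1), w2, b2);
    }
}
```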
Can it.... (Score:3, Funny)
Ohhh, can it tell me when to move and shoot as well? Hey then interface it with the keyboard and mouse inputs and all my games can play themselves (like masturbation for computers).
Then I can do other things while having fun playing games.
Cheating? (Score:1)
Ohhh, can it tell me when to move and shoot as well?
Maybe not now. Currently it's only for DotA. But it might easily be extended to other games, as you say. If so, won't cheating be easy in multiplayer games? How can that be prevented?
Re: (Score:2, Insightful)
Don't play against jackasses. Makes public servers a bit harder to deal with, but it is an easy solution otherwise.
Re: (Score:2)
Ohhh, can it tell me when to move and shoot as well?
It's emulating the human brain in a VM.
My equivalent implementation: while(true) {sleep(HALF_AN_HOUR); printf("You need to respawn.");}
Re: (Score:3, Funny)
Re: (Score:2)
It can also lock you in out of the airlocks, but only by accident.
Sorry, I'm just a bit upset that it's 2009, and our biggest problem with computers is still that they just aren't smart enough.
Re: (Score:2, Interesting)
My program had no capacity to play the game with a different interface than a human. It actually read the values of the pixels on the monitor, processe
Re: (Score:1)
1) Playing on lives didn't work because eventually a network would destroy the boss' offensive capabilities and hide in a corner. The game would never progress.
Why on earth would that poor guy want to get to a new level for more hurt? He's a lover, not a fighter! Why can't we all just get along?
Re: (Score:1)
Re: (Score:1)
WF is score based, track that.
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Ohhh, can it tell me when to move and shoot as well?
I know this is a joke, but you've clearly never played DotA :) It has the most advanced gameplay of literally any game I've ever played in all my years of gaming. Saving me a couple of keystrokes is all this program does. I've been using it for a couple weeks now. While it is helpful, it's hardly playing for me.
Re: (Score:2)
Nope, in fact, I don't even know what it stands for; the site linked doesn't tell me either. I assume from the links all over the site it's some WoW x-pac?
Huh. (Score:5, Informative)
Probably no one cares, but that's the wrong "queues" there. They mean "cues."
Re: (Score:3, Funny)
Probably no one cares, but that's the wrong "queues" there. They mean "cues."
I was wondering why I had trouble parsing that sentence, but didn't spot the reason. Thanks. :)
Re: (Score:2)
Re:Huh. (Score:4, Funny)
Probably no one cares, but thats the wrong "queues" their. They mean "cues."
Broke that for you.
Re: (Score:2)
HA!
It's not Slashdot's most annoying meme, but an incredible imitation!
Well done, sir. Well and truly done.
Re: (Score:2)
The '90s are over (Score:2, Informative)
Why on earth are people still wasting their time on Neural Networks? Sure, they have a catchy name, but everything else about them sucks. Today we have much more robust methods available, e.g. Relevance Vector Machines, etc.
Re: (Score:3, Funny)
Re: (Score:1)
Easy to implement. First thing that seems plausible that gets taught in a machine learning course. And sometimes, like, say... HERE.... it works pretty well!
Re: (Score:2)
It's open source, it's gonna take a while to catch up :p
Nehru netowrks? (Score:1)
I thought that went out with the 60s.
Hilarious Overkill (Score:5, Insightful)
So they designed and wrote a neural network for the sole purpose of identifying a limited set of icons? Seriously?
They could have done this using conventional methods that would be significantly faster. Me thinks someone was just doing this for entertainment.
Re: (Score:1)
Re:Hilarious Overkill (Score:5, Insightful)
Multi-resolution analysis, perhaps? An example of this method is wavelet decomposition.
Which is even more processor-intensive than a moderately sized neural net.
How about NO image recognition? (Score:2)
I don't know what the GP had in mind, but for my take: How about _no_ image recognition in the first place?
If you need to see when a game icon is activated, how about just looking at the byte that stores the state for that icon?
Re:How about NO image recognition? (Score:5, Informative)
Re:How about NO image recognition? (Score:5, Insightful)
As someone who's been writing external trainers for games for years (though admittedly it was some years ago), I can assure you first hand that accessing a game's internal data structures is indeed very possible.
And even if I couldn't find that boolean, I'd at least try to hook the point where it tries to draw that icon.
The idea of using image recognition on the screen is so horribly inefficient a method... I suppose it could be used if absolutely nothing else works, but really that's about it.
Re: (Score:2)
I tried this once and never came very far - can you give me a few 'pointers' on where to find resources and websites that explain these techniques?
Re: (Score:2, Interesting)
Except to do something like that you have to analyze the program code of every game you make a trainer for. I've never done that sort of thing, but it seems scary.
An approach like the one they've outlined can probably be moved from game to game with only parameter tweaks.
That is the true beauty of machine learning :)
Re: (Score:1)
I can assure you first hand that accessing a game's internal data structures is indeed very possible.
It could actually be simpler to do it his way. Icons never move, and he already knows where they are going to be on the screen, so no need to figure out what the data under the hood looks like. While it's certainly more processor intensive, and probably less elegant, Warcraft 3 is a pretty old game, so I don't think it's a big deal for most computers to handle the extra load. Or he could just be doing it cause it's more fun this way :)
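The "icons never move" observation suggests a much simpler screen-reading approach than a neural net: sample the known icon rectangle and decide dim vs. bright. This is a sketch of my own, not the article's method; the coordinates and brightness threshold are invented placeholders.

```java
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.image.BufferedImage;

// Sketch of fixed-position icon watching: grab a known screen region and
// classify it as greyed-out (recharging) or bright (ready). The coordinates
// and the threshold below are assumptions, not values from the article.
public class IconWatcher {
    // Average per-channel brightness (0-255) of a packed RGB pixel.
    static int brightness(int rgb) {
        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
        return (r + g + b) / 3;
    }

    // Cooldown icons are typically greyed out while recharging; treat an
    // average brightness above the threshold as "ready".
    static boolean looksReady(BufferedImage icon, int threshold) {
        long total = 0;
        for (int y = 0; y < icon.getHeight(); y++)
            for (int x = 0; x < icon.getWidth(); x++)
                total += brightness(icon.getRGB(x, y));
        return total / (icon.getWidth() * icon.getHeight()) > threshold;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical icon position; a real tool would calibrate this once.
        Rectangle iconRect = new Rectangle(1200, 980, 48, 48);
        BufferedImage shot = new Robot().createScreenCapture(iconRect);
        System.out.println(looksReady(shot, 96) ? "power-up ready" : "recharging");
    }
}
```

No training, no hidden layer: one threshold comparison per frame.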
Re: (Score:2)
Accessing a game's internal data structures could often be against the EULA, whereas reading your graphics card's frame buffer isn't.
Re: (Score:2)
Re: (Score:2)
i would have tried non-negative matrix factorization personally.
Re: (Score:1)
Re: (Score:1)
Ok, I skimmed the article.
They don't need to find the image - they start out knowing where it is, so stuff like template matching is irrelevant.
And since the NN started out knowing where the icons were, figuring out which ones matched which signals is not as impressive. It's just a ho-hum NN application, now.
Re: (Score:3, Informative)
Me thinks someone was just doing this for entertainment.
Almost certainly, especially since a complete success would just mean they can play video games slightly more efficiently.
This toolkit worked for them, but does using a neural networking toolkit mean that what you produce is a neural network? It looks like the output neurons are doing image matching, and the hidden layer is identifying interesting candidates from a stream. In their environment, interesting candidates are any box that ticks from dim to bright (so they can spot the re-charged state when it
Re: (Score:1, Informative)
Re: (Score:2)
see http://neil.fraser.name/writing/tank/ [fraser.name]
Re: (Score:1)
All your link demonstrates is that you need to get your training data correct. This doesn't tell me anything about why neural networks are "a bad idea".
Re: (Score:3, Interesting)
The point is that you have no way of determining what training data is "correct" because you don't know anything about what the NN is "looking at".
There's also no guarantee that the network will ever converge, and if it does, there's no way to know whether it has converged to a local minimum, which isn't the solution, rather than the global minimum.
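The local-vs-global-minimum worry is easy to reproduce with plain gradient descent on a toy one-dimensional "loss". The function and step size below are illustrative choices of mine, not anything from the article: f(x) = x^4 - 3x^2 + x has two minima, and which one you land in depends entirely on where you start.

```java
// Gradient descent on f(x) = x^4 - 3x^2 + x, which has a shallow local
// minimum near x ~ 1.13 and a deeper global minimum near x ~ -1.30.
// Same algorithm, same function, different answers depending on the start.
public class LocalMinDemo {
    static double f(double x)    { return x * x * x * x - 3 * x * x + x; }
    static double grad(double x) { return 4 * x * x * x - 6 * x + 1; }

    static double descend(double x, double lr, int steps) {
        for (int i = 0; i < steps; i++) x -= lr * grad(x);
        return x;
    }

    public static void main(String[] args) {
        double fromRight = descend( 1.0, 0.01, 2000); // stuck in the local minimum
        double fromLeft  = descend(-1.0, 0.01, 2000); // finds the global minimum
        System.out.printf("f(%+.2f) = %+.2f  vs  f(%+.2f) = %+.2f%n",
                fromRight, f(fromRight), fromLeft, f(fromLeft));
    }
}
```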
Re:Hilarious Overkill (Score:5, Insightful)
IN JAVA. Speed was obviously not in the design criteria.
The '90s are over. Java is now one of the fastest languages around.
Re: (Score:2, Funny)
The '90s are over. Java is now one of the fastest languages around.
Around where?
Everywhere, that's the cool thing about java, you know.
Re:Hilarious Overkill (Score:4, Insightful)
http://en.wikipedia.org/wiki/ARM_architecture [wikipedia.org] And that includes a few billion ARM processors used in mobile phones. Sure, they can run Java, but it's nowhere near as fast as C.
Java is nice, but the 'runs everywhere' feature is the least interesting one of them all. I can run an emulated full-blown x86 on an 8-bit microcontroller, but that does not make it useful.
Re: (Score:1, Interesting)
Did you miss the part where some ARM processors can execute Java byte code natively?
Re: (Score:2)
Except on everything that's not x86. The speed you currently see in desktop/server Java is only accomplished by very good just-in-time compilers, which are tweaked for x86. So everything else runs Java like crap.
Have you not heard of Jazelle [arm.com]? It supports just-in-time and ahead-of-time compilers for ARM and is mentioned in the Wikipedia link you gave.
It provides direct execution of Java bytecode and was announced in 2000 [eetimes.com].
Re: (Score:1, Informative)
Is your application CPU-limited? If so, is it *the* fastest language? Those are the questions one should be asking when picking a programming language.
If your application is limited by the CPU, only the fastest language, C, will do for some routines. You may even consider using assembly or machine-optimized code such as Atlas [sourceforge.net]
If your application isn't limited by the CPU, then development speed is more important than execution speed. A rule of thumb I use is ho
Re:Hilarious Overkill (Score:5, Insightful)
Is your application CPU-limited?
No, it's developer-limited. For most applications, development time is a bigger issue than execution speed. Only for very heavily used low-level routines (OS stuff, graphics libraries, VMs, etc) is it really worthwhile spending extra effort on extreme optimisation.
If so, is it *the* fastest language?
I don't have any recent benchmarks, but I remember that back in the days of the Java 5 JVM, Java was about 10% slower than equivalent C++, which is pretty good. But since then, JVMs have gotten quite a bit faster. It would surprise me if Java were not on at least equal terms with C++ now, although highly optimised low-level C is still going to be faster. But that's also extremely tedious to code.
Those are the questions one should be asking when picking a programming language.
No, the main questions you need to ask when picking a language are whether your code is going to be maintainable, and how expensive you can afford your maintenance to be. That's still the main timesink in development.
If your application is limited by the CPU, only the fastest language, C, will do for some routines. You may even consider using assembly or machine-optimized code such as Atlas [sourceforge.net]
You accidentally hit the nail right on the head there: C is not necessarily the fastest language; highly optimised custom assembly is. And any language is only as fast as it can be if the programmer knows what he's doing. Some languages do more than others to make optimal code easy to write.
Java development, in my experience, is more laborious than Python or Ruby. Unless you have big teams of developers who must work close together, I wouldn't recommend Java for anything.
Oh, I agree, Java stopped being an easy development language quite some time ago, and moved to the side of the fast execution languages. This is also why I switched from Java to Ruby. However, I just might switch to Scala because recent JVMs are so incredibly cool. The power of Java these days is more in the awesomeness of the JVM than in the language itself.
Even so, there is an enormous amount of support for Java. It is by far the biggest language for enterprisey server stuff. I think there are as many web frameworks for Java as there are for all other programming languages put together. This is one of the big strengths of Java, but at the same time, this architectural overload is also one of the major hurdles for starting in Java.
However, my point was that Java is pretty fast, which it is. If speed is an issue, Java can be an excellent choice (unlike Ruby, for example). If speed is the only thing that matters, then highly optimised C or assembly is really the only option.
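Claims like "Java is pretty fast" are hard to check with naive timing, because the first pass through a method runs interpreted and later passes run JIT-compiled. This is a cautionary sketch of mine, not a real benchmark (for that you'd want a proper harness); the workload and iteration counts are arbitrary.

```java
// Why naive Java benchmarks mislead: warm the JIT up before timing.
// Absolute numbers will vary wildly by machine and JVM.
public class WarmupDemo {
    // Some deterministic CPU-bound work: sum of squares of 0..n-1.
    static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) sum += (long) i * i;
        return sum;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) work(10_000_000); // warm-up: let the JIT kick in
        long t0 = System.nanoTime();
        long result = work(10_000_000);
        long t1 = System.nanoTime();
        System.out.println("sum=" + result + " in " + (t1 - t0) / 1_000 + " microseconds");
    }
}
```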
Re: (Score:2)
The "problem", as you pointed out, is that Java has a lot of options while Ruby has Rails and Python has Django and that's about it. You could build a similar tool set/stack with Java to increase productivity. There are Java MVC frameworks, CRUD application generators, persistence strategies, code generators, and so on. A good Java IDE [netbeans.org] helps piece them together.
It could/should be better though but with the right set of tools/frameworks/libraries you can be very productive in Java.
Re: (Score:2)
If your application is limited by the CPU, you'll need to parallelize it to take advantage of multicore CPUs. C is probably the worst possible choice there, with the possible exception of assembly.
Furthermore, while low-level languages allow for all kinds of tricks in the hands of a master, the chances are that you are close to an av
Re: (Score:2)
After the half hour it takes to load. ;)
Re: (Score:2)
Re: (Score:1)
Languages aren't fast -- compilers are efficient. How "fast" a language is perceived to be is purely a function of whether an efficient compiler exists for the particular platform.
Plus, you can always compile Java to native code.
Re: (Score:3, Insightful)
Re: (Score:1, Redundant)
Re: (Score:1, Flamebait)
Java had a horrible case of too many cooks.
Yeah, screw all this collaboration crap. Let's get back to a single-vendor proprietary solution. Someone please inform SourceForge that we will no longer be needing their services.
Re: (Score:2)
Re: (Score:2)
Deprecated APIs have not been removed, and haven't changed nearly that often. So what are you really complaining about?
Re: (Score:2)
If anything, MS keeps things far too long. Ars wrote up an excellent article on it. Here's a snip:
For example, there's a function called OpenFile. OpenFile was a Win16 function. It opens files, obviously enough. In Win32 it was deprecated--kept in, to allow 16-bit apps to be ported to Win32 more easily, but deprecated all the same. In Win32 it has always been deprecated. The documentation for OpenFile says, "Note: Only use this function with 16-bit versions of Windows. For newer applications, use the CreateFile function."
Re: (Score:1)
The good thing about Java is that when the API changes you still have the option of running older JREs. I'm not sure if that's the case with the .NET Framework.
The .NET Framework is similar to the JRE in this regard. You can target a build to use whichever version of the .NET Runtime you choose. That is, if your application targets the .NET Framework 2.0, it will work wherever the .NET 2.0 Runtime is installed, regardless of any changes made in the .NET Framework 3.5.
As far as the underlying Windows API changes, a motivation for the .NET Framework is to abstract development away from the OS API. It's similar to the approach used with Java, although primarily
Re: (Score:2)
Being somewhat serious, why does the slashdot groupthink give C# a free pass whereas java gets all the hate? I haven't looked, but I assumed both are similar in performance.
.NET (including C#) keeps more metadata around for the JIT to use, which could theoretically result in improved performance.
One example is generics: if you use ArrayList<int> in Java, you end up with a bunch of boxing/unboxing overhead at runtime, because the compiler erases the type and turns it into a list of objects. List<int> in .NET remains a list of primitives, so there's no boxing.
On the other hand, I suspect more effort has gone into optimizing JVMs than optimizing the CLR, so I'm not su
Re: (Score:1)
One example is generics: if you use ArrayList<int> in Java, you end up with a bunch of boxing/unboxing overhead at runtime, because the compiler erases the type and turns it into a list of objects. List<int> in .NET remains a list of primitives, so there's no boxing.
While C# is better in some regards, I don't think your description of Java is quite correct. AFAIK the boxing/unboxing occurs at compile (to bytecode) time, and the JVM sees *exactly* the same code as if you'd used a raw List -- that is, generics are strictly syntactic sugar in Java and can only be detected at compile time. Absolutely, it's time for Java to die and rise from the ashes as Java2, or just die completely and let JRuby/scala/whatever take over. Sure, C# is a tad better in some regards, but wha
Re: (Score:2)
AFAIK the boxing/unboxing occur at compile (to bytecode) time and the JVM sees *exactly* the same code as if you'd done List int -- that is, generics is strictly a syntactic sugar in Java which can only be detected at compile time.
I can't tell if you're disagreeing: that sounds just like what I said.
The Java compiler turns all ArrayList<T> types into the same ArrayList type. The "T" part is erased, and the compiler inserts casts or boxing/unboxing operations to convert between "T" and "object", effectively changing a statement like "list.add(5)" into "list.add(new Integer(5))".
The boxing/unboxing opcodes are inserted at compile time, but those opcodes cause boxing/unboxing operations to take place at runtime. That means more ru
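Both sides of this sub-thread can be checked directly. This is a toy demonstration of mine, not from the article: reflection shows that every `ArrayList<T>` shares one runtime class (erasure), and that an `int` added to such a list really does become an `Integer` object.

```java
import java.util.ArrayList;
import java.util.List;

// Type erasure and autoboxing, observed at runtime.
public class ErasureDemo {
    public static void main(String[] args) {
        List<Integer> ints = new ArrayList<Integer>();
        List<String> strings = new ArrayList<String>();

        // The type parameter is gone at runtime -- both are plain ArrayList.
        System.out.println(ints.getClass() == strings.getClass()); // true

        ints.add(5); // autoboxed: effectively ints.add(Integer.valueOf(5))
        Object element = ((List) ints).get(0);
        System.out.println(element instanceof Integer); // true
    }
}
```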
Re: (Score:2)
Moron.
I have a console written in Java. It opens in a fraction of a second.
Re: (Score:2)
Indeed.
I mean, I built an identical thing to what they've built (image recognition engine for fixed icons) once using the Perl regular expression engine [cpan.org] (mostly just to prove it could be done). It was pretty awesome.
But I have no illusions that it is the sort of thing that I should be promoting on Slashdot. ...oh wait...
Re: (Score:1, Informative)
That's a completely different approach.
A neural network can detect similar images; the Perl solution can detect only binary-equivalent images.
They have chosen this solution as a proof of concept, not for speed.
Re: (Score:2)
Not necessarily, there are ways to implement the equivalent of alpha-channel support in the regex.
I agree, though, that you can't do a percentage match like the neural net does.
Re: (Score:3, Insightful)
So they designed and wrote a neural network for the sole purpose of identifying a limited set of icons? Seriously?
They used a library. And it's just an algorithm.
Re: (Score:3, Interesting)
Neuroph looks pretty cool (Score:3, Informative)
I've used neural networks and genetic programming a few times at work. It's completely different from normal programming. Instead of understanding a problem completely and writing a structured solution to the task, you get a network and train it until its output matches what you think the output should be; no programming involved.
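That train-until-it-matches loop, in its smallest possible form, is a single perceptron learning AND. This is a toy example of mine (not the Neuroph API): keep nudging the weights whenever the output disagrees with the target, and stop when they match.

```java
// A single perceptron trained on AND with the classic perceptron rule.
public class PerceptronDemo {
    double w0, w1, bias;

    int predict(int a, int b) {
        return (w0 * a + w1 * b + bias) > 0 ? 1 : 0;
    }

    // Nudge the weights whenever the output disagrees with the target.
    void train(int[][] inputs, int[] targets, int epochs, double lr) {
        for (int e = 0; e < epochs; e++)
            for (int i = 0; i < inputs.length; i++) {
                int error = targets[i] - predict(inputs[i][0], inputs[i][1]);
                w0   += lr * error * inputs[i][0];
                w1   += lr * error * inputs[i][1];
                bias += lr * error;
            }
    }

    public static void main(String[] args) {
        PerceptronDemo p = new PerceptronDemo();
        int[][] in = {{0,0},{0,1},{1,0},{1,1}};
        p.train(in, new int[]{0, 0, 0, 1}, 50, 0.1);
        for (int[] x : in)
            System.out.println(x[0] + " AND " + x[1] + " = " + p.predict(x[0], x[1]));
    }
}
```

AND is linearly separable, so this converges; the same loop would churn forever on XOR, which is exactly why the article's classifier needs a hidden layer.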
Basshunter (Score:3, Funny)
I just got that obnoxious song out of my head, and now they want to put the whole game in there? No thanks!
Re: (Score:3, Funny)
Next, they'll let players play the game through an IRC bot named Anna...
SecuROM has... (Score:2, Funny)
BTDT (Score:1)
I've seen this on Gilligan's Island already, where the professor hooked electrodes up to the Beatles wannabees while they played.
What is the game it monitors? (Score:2)
What is DotA?
No, seriously; the article is of no use, and the link in the article to what appears to be the homepage of this DotA game that this code hacks on top of is also completely useless in telling me what kind of game this actually is; everything involved in this is a cryptic thicket of terms for this highly involved game I have never heard of.
Re: (Score:2)
Thanks! I've never played any incarnation of Warcraft, so almost nothing about this article made a damn bit of sense to me.
Re: (Score:2)
Had the same problem; absolutely nowhere does it actually state the full name of the game. I saw a few links to WoW stuff on there and figured it was some add-in for the latest expansion...