AI on GPU
Posted: Tue Apr 27, 2010 12:45 pm
by scheherazade
Little application that I wrote...
3D Gradient descent based path planner.
Written as a general purpose GPU program.
Each AI has its own kernel on the GPU.
Scales well (1 AI vs. 40 AIs runs at roughly the same speed).
http://www.youtube.com/watch?v=KzjAE6EeEso&fmt=22
-scheherazade
Re: AI on GPU
Posted: Tue Apr 27, 2010 6:45 pm
by Sabre
Very nice job

Was this for home or for work? Either way, it turned out very well! I've been wanting to do some CUDA programming for a while; it's just the time element that has escaped me...
Re: AI on GPU
Posted: Tue Apr 27, 2010 7:23 pm
by scheherazade
Project for grad school.
The professor gave no homework and no exams.
He just said "come up with something to entertain me"...
So I thought, ok, let's throw some AI onto the GPU.
It's not bad to pick up.
It's just C code with a few caveats.
I started with an example found here :
http://llpanorama.wordpress.com/2008/05 ... a-program/
It basically takes an array of data from the host, squares it on the GPU, and returns it to the host.
That answered all the "how do I get started" questions. The rest is just grunt work.
-scheherazade
Re: AI on GPU
Posted: Wed Apr 28, 2010 9:07 am
by Sabre
Thanks for the pointer to the example code. I'm going to have to see what I can come up with.
