cocos2d supports both TTF (TrueType Font) labels and texture atlas labels.
(Please note that from cocos2d version 0.7 onward, a label is added to its layer via addChild: and not add:, e.g. [self addChild:myLabel];)
Pros and cons of TTF labels (CCLabel):
* + All the pros of TTF fonts: any size, kerning support, etc.
* + Easy to use; no need for an external editor.
* - Creation/update is very slow, since a new texture is created each time.
Pros and cons of texture atlas labels (CCLabelAtlas, CCBitmapFontAtlas):
* + Creation/update is very fast, since they don't create a new texture.
* + Fonts can be customized (shadows, gradients, blur, etc.)
* - Depends on external editors: AngelCode / Hiero editor, GIMP / Photoshop
Creating labels: Simple way
The easiest way to create a label is by using the CCLabel object. Example:
CCLabel *label = [CCLabel labelWithString:@"Hello World" fontName:@"Marker Felt" fontSize:24];
[self addChild:label];
fontName is the TTF font name to be used.
You can use your own custom TTF file. You just need to add the .ttf file to the project. Example of custom TTF file:
CCLabel *label = [CCLabel labelWithString:@"Hello World" fontName:@"Schwarzwald Regular" fontSize:24];
[self addChild:label];
* cocos2d will first try to load the font using the FontLabel library.
* If that fails, it will use the UIFont class.
Important: The size of the OpenGL texture will be automatically calculated based on the font size and font name.
Creating labels: Complex way
You can also create labels with explicit texture dimensions using this API:
CCLabel *left = [CCLabel labelWithString:@"Hello World" dimensions:CGSizeMake(480,50) alignment:UITextAlignmentLeft fontName:@"Marker Felt" fontSize:32];
[self addChild:left];
If you use this approach, you must pass the dimensions of the OpenGL texture to be used. If the texture is not big enough, only part of the label will be rendered.
Possible alignments:
* UITextAlignmentLeft (left alignment)
* UITextAlignmentCenter (center alignment)
* UITextAlignmentRight (right alignment)
Updating
Like any object that implements the CCLabelProtocol protocol you can update it using the setString method. Example:
[label setString: @"Hello World 2"];
Important: Every time you call setString a NEW OpenGL texture will be created. This means that setString is as slow as creating a new CCLabel. So, DO NOT use CCLabel objects if you need to update them frequently. Instead use CCLabelAtlas or CCBitmapFontAtlas.
Color
You can change the color of your label by setting the color property, which takes a ccColor3B value:
label.color = ccc3(0,0,0);
(Transparency is controlled separately through the opacity property, not through a fourth color component.)
ccc3 example colors:
white - (255,255,255)
black - (0,0,0)
blue - (0,0,255)
green - (0,255,0)
red - (255,0,0)
grey - (84,84,84)
brown - (165,42,42)
pink - (255,192,203)
purple - (160,32,240)
yellow - (255,255,0)
gold - (255,215,0)
Alignment
If you want to modify the alignment you can use the anchorPoint property. Example:
//left alignment
[label setAnchorPoint: ccp(0, 0.5f)];
// right alignment
[label setAnchorPoint: ccp(1, 0.5f)];
// center alignment (default)
[label setAnchorPoint: ccp(0.5f, 0.5f)];
Texture Atlas labels
There are two types of labels based on a texture atlas:
* CCBitmapFontAtlas
* CCLabelAtlas
Introduction
The CCBitmapFontAtlas is the suggested way to create fast labels, since:
* The bitmap (image) can be customized with the editors
* You can update/init the label without penalty
* It is very flexible: each letter of the label can be treated like a CCSprite
* It has kerning support
The CCBitmapFontAtlas label parses the AngelCode font format to create a label. To create this kind of label, you can use any of these editors:
* http://www.n4te.com/hiero/hiero.jnlp (Java version)
* http://slick.cokeandcode.com/demos/hiero.jnlp (Java version)
* http://www.angelcode.com/products/bmfont/ (Windows only)
Java editors vs. the Windows editor:
* The Windows editor is the official AngelCode editor
* The Java editors run on the Mac
* The Java editors have additional features such as shadow, gradient and blur
Creating a BitmapFontAtlas
To create a CCBitmapFontAtlas object you need to do:
CCBitmapFontAtlas *label = [CCBitmapFontAtlas bitmapFontAtlasWithString:@"Hello World" fntFile:@"bitmapFontTest.fnt"];
[self addChild:label];
Manipulating each character
Since CCBitmapFontAtlas is a subclass of CCSpriteSheet, you can manipulate each character as a CCSprite. The first character is added with tag = 0, the second with tag = 1, and so on. Example:
CCBitmapFontAtlas *label = [CCBitmapFontAtlas bitmapFontAtlasWithString:@"Bitmap Font Atlas" fntFile:@"bitmapFontTest.fnt"];
CCSprite *char_B = (CCSprite*) [label getChildByTag:0]; // character 'B'
CCSprite *char_m = (CCSprite*) [label getChildByTag:3]; // character 'm'
LabelAtlas
Introduction
CCLabelAtlas was the 1st fast label added into cocos2d. But it was superseded by CCBitmapFontAtlas. It is being maintained for backwards compatibility, but you should use CCBitmapFontAtlas instead.
Creating a LabelAtlas
CCLabelAtlas *label = [CCLabelAtlas labelAtlasWithString:@"Hello World" charMapFile:@"tuffy_bold_italic-charmap.png" itemWidth:48 itemHeight:64 startCharMap:' '];
[self addChild:label];
* charMapFile is an image file that contains all the characters. The characters must be ordered by their ASCII values, and the image cannot contain more than 256 characters.
* itemWidth is the width of each character in pixels
* itemHeight is the height of each character in pixels
* startCharMap is the first character of the map.
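To see how a character map is resolved, the lookup can be sketched as a short calculation: the character's ASCII offset from startCharMap gives its index, and that index is converted into a grid position in the image. This is a Python sketch for illustration only; the items-per-row value is an assumption (it depends on the actual image width):

```python
# Sketch of a char-map lookup, assuming characters are laid out
# left-to-right, top-to-bottom in the atlas image.
ITEM_WIDTH, ITEM_HEIGHT = 48, 64   # from the labelAtlasWithString: call above
ITEMS_PER_ROW = 16                 # assumption: depends on the image width
START_CHAR = " "                   # first character of the map

def char_rect(c):
    """Return (x, y, width, height) of the sub-image for character c."""
    index = ord(c) - ord(START_CHAR)
    row, col = divmod(index, ITEMS_PER_ROW)
    return (col * ITEM_WIDTH, row * ITEM_HEIGHT, ITEM_WIDTH, ITEM_HEIGHT)

print(char_rect("!"))  # '!' is one past ' ', so (48, 0, 48, 64)
```

This is why the characters must be ordered by ASCII value: the engine can find any glyph with pure arithmetic instead of a lookup table.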
Updating a LabelAtlas / BitmapFontAtlas
Like any object that implements the CCLabelProtocol protocol you can update it using the setString method.
[label setString:@"Hello World 2"];
It is worth noting that updating a CCLabelAtlas or a CCBitmapFontAtlas has almost no penalty.
Alignment in LabelAtlas / BitmapFontAtlas
If you want to modify the alignment you can use the anchorPoint property. Example:
//left alignment
[label setAnchorPoint: ccp(0, 0.5f)];
// right alignment
[label setAnchorPoint: ccp(1, 0.5f)];
// center alignment (default)
[label setAnchorPoint: ccp(0.5f, 0.5f)];
Tuesday, November 9, 2010
Implement Push Notification in iPhone Game or Application
One of the widely anticipated features of the new iPhone OS 3.0 is push notifications, which allow messages to be sent directly to an individual device, relevant to the application that has been installed. Apple has demoed this as useful for news alerts or IM notifications; however, it fits perfectly with the nature of our server monitoring service, Server Density.
As part of the product, we have an iPhone application that includes push notifications as an alerting option, so you can be notified via push, direct to your iPhone, when one of your server alerts has been triggered. This is useful since our app can then be launched to instantly see the details of the server that caused the alert.
Apple provides detailed code documentation for the iPhone OS code that is needed to implement and handle the alerts on the device but only provides a higher level guide for the provider server side.
As a provider, you need to communicate with the Apple Push Notification Service (APNS) to send the messages that are then pushed to the phone. This is necessary so that the device only needs to maintain 1 connection to the APNS, helping to reduce battery usage.
This tutorial will go into code-level detail about how we built our push notification provider server to allow us to interact with the APNS and use the push notifications with our server monitoring iPhone application. Since we develop in PHP, our examples will be in PHP 5.
Basic Structure
1. You connect to the APNS using your unique SSL certificate
2. Cycle through the messages you want to send (or just send 1 if you only have 1)
3. Construct the payload for each message
4. Disconnect from APNS
The flow of remote-notification data is one-way. The provider composes a notification package that includes the device token for a client application and the payload. The provider sends the notification to APNS, which in turn pushes the notification to the device.
Restrictions
* The payload is limited to 256 bytes in total – this includes both the actual body message and all of the optional and additional attributes you might wish to send. Push notifications are not designed for large data transfer, only for small alerts. For example we only send a short alert message detailing the server monitoring alert triggered.
* APNS does not provide any status feedback as to whether your message was successfully delivered. One reason for this is that messages are queued to be sent to the device if it is unreachable, however only the last sent message will be queued – overwriting any previously sent but undelivered messages.
* Push notifications should not be used for critical alerts because the message will only be delivered if the device has wifi or cellular connectivity, which is why we recommend combining push with another alerting method such as e-mail or SMS for our server monitoring alerts.
* The SSL certificates used to communicate with APNS, discussed below, are generated on an application level. The implementation discussed in this tutorial only concerns a single iPhone application so if you have several, you will need to adapt the code to use the appropriate certificate(s) where necessary.
Device Token
Each push message must be “addressed” to a specific device. This is achieved by using a unique deviceToken generated by APNS within your iPhone application. Once this token has been retrieved, you need to store it on your server, not within your iPhone application itself. It looks something like this:
c9d4c07c fbbc26d6 ef87a44d 53e16983 1096a5d5 fd825475 56659ddd f715defc
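Since the token arrives as a space-separated hex string but is needed as 32 raw bytes when sending, it is worth normalizing and validating it before storing it. A minimal sketch in Python (the same check can be written in any language; the function name is our own):

```python
import binascii

def normalize_device_token(token):
    """Strip the display spaces and decode the 64-hex-character token."""
    hex_token = token.replace(" ", "")
    if len(hex_token) != 64:
        raise ValueError("device token must be 64 hex characters (32 bytes)")
    return binascii.unhexlify(hex_token)  # raises binascii.Error on non-hex input

raw = normalize_device_token(
    "c9d4c07c fbbc26d6 ef87a44d 53e16983 1096a5d5 fd825475 56659ddd f715defc"
)
print(len(raw))  # 32 bytes, ready to pack into the binary notification
```

Rejecting malformed tokens at storage time is cheaper than discovering them when a send fails silently, given that APNS provides no delivery feedback.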
For the Server Density iPhone application, we call the necessary generation methods on app launch and pass it back to our servers via an HTTP API call. This stores the deviceToken in a database on our servers for that user so we can then communicate with the device linked to that user.
Feedback Service
Apple provides a feedback service which you are supposed to poll occasionally. It returns a list of deviceTokens that were previously valid but no longer are, for example because the user has uninstalled your iPhone application. You can then remove those deviceTokens from your database so you do not communicate with an invalid device.
Using the feedback service is not covered by this tutorial.
Certificates
The first thing you need is your Push certificates. These identify you when communicating with APNS over SSL.
Generating the Apple Push Notification SSL certificate on Mac:
1. Log in to the iPhone Developer Connection Portal and click App IDs
2. Ensure you have created an App ID without a wildcard. Wildcard IDs cannot use the push notification service. For example, our iPhone application ID looks something like AB123346CD.com.serverdensity.iphone
3. Click Configure next to your App ID and then click the button to generate a Push Notification certificate. A wizard will appear guiding you through the steps to generate a signing authority and then upload it to the portal, then download the newly generated certificate. This step is also covered in the Apple documentation.
4. Import your aps_developer_identity.cer into your Keychain by double clicking the .cer file.
5. Launch Keychain Access on your Mac and, from the login keychain, filter by the Certificates category. You will see an expandable entry called "Apple Development Push Services".
6. Expand this entry, then right-click on "Apple Development Push Services" > Export "Apple Development Push Services ID123". Save this as an apns-dev-cert.p12 file somewhere you can access it.
7. Do the same again for the "Private Key" that was revealed when you expanded "Apple Development Push Services", ensuring you save it as an apns-dev-key.p12 file.
8. These files now need to be converted to the PEM format by executing this command from the terminal:
openssl pkcs12 -clcerts -nokeys -out apns-dev-cert.pem -in apns-dev-cert.p12
openssl pkcs12 -nocerts -out apns-dev-key.pem -in apns-dev-key.p12
9. If you wish to remove the passphrase, either do not set one when exporting/converting or execute:
openssl rsa -in apns-dev-key.pem -out apns-dev-key-noenc.pem
10. Finally, you need to combine the key and cert files into an apns-dev.pem file that we will use when connecting to APNS:
cat apns-dev-cert.pem apns-dev-key-noenc.pem > apns-dev.pem
It is a good idea to keep the files and give them descriptive names should you need to use them at a later date. The same process above applies when generating the production certificate.
Payload Contents
The payload is formatted in JSON, compliant with the RFC 4627 standard. It consists of several parts:
* Alert – the text string to display on the device
* Badge – the integer to display as a badge on the application icon on the device home screen
* Sound – the name of the sound to play when the message is displayed on the device
This tutorial only deals with the basics, sending a simple alert text string, but the alert can also be a dictionary containing various options, such as custom button titles.
Creating the payload
Using PHP it is very easy to create the payload based on an array and convert it to JSON:
$payload['aps'] = array('alert' => 'This is the alert text', 'badge' => 1, 'sound' => 'default');
$payload = json_encode($payload);
Echoing the contents of $payload would show you the JSON string that can be sent to APNS:
{
"aps" : { "alert" : "This is the alert text", "badge" : 1, "sound" : "default" }
}
This will cause a message to be displayed on the device, trigger the default alert sound and place a "1" badge on the application icon. The default "Close" and "View" buttons would also appear on the alert that pops up.
For the Server Density server monitoring iPhone application, it is important for the user to be able to tap "View" and go directly to the server that generated the alert. To do this, we add an extra dictionary of our own custom values:
$payload['aps'] = array('alert' => 'This is the alert text', 'badge' => 1, 'sound' => 'default');
$payload['server'] = array('serverId' => $serverId, 'name' => $name);
$output = json_encode($payload);
The custom dictionary server is passed to the application on the device when the user taps “View” so we can load the right server. The JSON looks like this:
{
"aps" : { "alert" : "This is the alert text", "badge" : 1, "sound" : "default" },
"server" : { "serverId" : 1, "name" : "Server name" }
}
The size limit of 256 bytes applies to this entire payload, including any custom dictionaries.
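A simple guard is to measure the encoded JSON before queuing it. This is a Python sketch of the idea (our production check is in PHP); the compact separators mirror PHP's json_encode, which emits no extra whitespace:

```python
import json

MAX_PAYLOAD_BYTES = 256  # the APNS limit described above

payload = {
    "aps": {"alert": "This is the alert text", "badge": 1, "sound": "default"},
    "server": {"serverId": 1, "name": "Server name"},
}
# Compact encoding (no spaces after ':' and ',') keeps the byte count down.
encoded = json.dumps(payload, separators=(",", ":"))

if len(encoded.encode("utf-8")) > MAX_PAYLOAD_BYTES:
    raise ValueError("payload exceeds the 256-byte APNS limit")
```

Measuring the UTF-8 byte length, not the character count, matters if the alert text can contain non-ASCII characters.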
The raw interface
Once an alert is generated within Server Density, the payload is built and then inserted into a queue. This is processed separately so that we can send multiple payloads in one go if necessary.
Apple recommends this method because if you are constantly connecting and disconnecting to send each payload, APNS may block your IP.
As described by Apple:
The raw interface employs a raw socket, has binary content, is streaming in nature, and has zero acknowledgment responses.
Opening the connection
The PHP 5 code to open the connection looks like this:
$apnsHost = 'gateway.sandbox.push.apple.com';
$apnsPort = 2195;
$apnsCert = 'apns-dev.pem';
$streamContext = stream_context_create();
stream_context_set_option($streamContext, 'ssl', 'local_cert', $apnsCert);
$apns = stream_socket_client('ssl://' . $apnsHost . ':' . $apnsPort, $error, $errorString, 2, STREAM_CLIENT_CONNECT, $streamContext);
If an error has occurred you can pick up the error message from $errorString. This will also contain the details if your SSL certificate is not correct.
The certificate file is read in relative to the current working directory of the executing PHP script, so specify the full absolute path to your certificate if necessary.
Note that when testing you must use the sandbox with the development certificates. The production hostname is gateway.push.apple.com and must use the separate and different production certificate.
Sending the payload
At this point, the code we use loops through all the queued payloads and sends them. Constructing the binary content to send to APNS is simple:
$apnsMessage = chr(0) . chr(0) . chr(32) . pack('H*', str_replace(' ', '', $deviceToken)) . chr(0) . chr(strlen($payload)) . $payload;
fwrite($apns, $apnsMessage);
Note that the $deviceToken is included from our database and stripped of the spaces it is provided with by default. We also include a check to send an error to us in the event that the $payload is over 256 bytes.
$apnsMessage contains the correctly binary formatted payload and the fwrite call writes the payload to the currently active streaming connection we opened previously, contained in $apns.
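For reference, the same binary frame can be written with explicit big-endian fields. This Python sketch is equivalent to the PHP concatenation above: a command byte of 0, a two-byte token length of 32, the raw token, a two-byte payload length, then the payload:

```python
import struct

def build_apns_frame(token_bytes, payload_json):
    """Build the 'simple' APNS notification frame: command 0, 2-byte token
    length, raw token, 2-byte payload length, payload (lengths big-endian)."""
    payload = payload_json.encode("utf-8")
    return (
        struct.pack("!BH", 0, len(token_bytes))  # command byte + token length
        + token_bytes
        + struct.pack("!H", len(payload))        # payload length
        + payload
    )

token = bytes.fromhex(
    "c9d4c07cfbbc26d6ef87a44d53e169831096a5d5fd82547556659dddf715defc"
)
frame = build_apns_frame(token, '{"aps":{"alert":"Hi"}}')
print(len(frame))  # 1 + 2 + 32 + 2 + 22 = 59 bytes
```

struct.pack('!H', ...) writes a proper two-byte big-endian integer, which is what the PHP chr(0) . chr(strlen($payload)) pair emulates for payloads under 256 bytes.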
Once completed, you can close the connection. Since $apns is a stream resource returned by stream_socket_client, fclose is the correct way to close it:
fclose($apns);
php-apns
There is a free, open source server library called php-apns that provides all of the above functionality. We chose to implement it ourselves because php-apns has a further dependency on memcached, we did not want to rely on third-party code for large and critical aspects of our code base, and I am apprehensive about the suitability of PHP for running a continuous server process. We do all of the above queue processing using our own custom cron system, which runs every few seconds; that way PHP scripts do not need to run as long-lived processes, something I'm not sure they were designed to do.
All done
That’s it! If you have any problems, post in the comments below.
Friday, November 5, 2010
Game Engine and Games Framework List
Freeware engines
These engines are available for free use, but without the source code being available under an open source license. Many of these engines are commercial products which have a free edition available for them:
* Adventure Game Studio — Mainly used to develop third-person pre-rendered adventure games, this engine is one of the most popular for developing amateur adventure games.
* Cocos2d — A 2D game engine for making iPhone games.
* DikuMUD and derivatives — MUD engines
* dim3 — Freeware 3D javascript engine for the Mac (although finished games are cross platform).
* DX Studio — A freeware 3D game engine with complete tools for 3D video game development. Upgrading to paid licenses would unlock extra features.
* Game Maker Lite — Object-oriented game development software with a scripting language as well as a drag-and-drop interface.
* LPMud and derivatives (including MudOS and FluffOS) — MUD engines
* MUSH — MU* engine
* M.U.G.E.N — A 2D fighting game engine.
* Open Scene Graph — An open source 3D graphics toolkit, used by application developers in fields such as visual simulation, computer games, virtual reality, scientific visualization and modelling.
* Panda3D — (Releases prior to May 28, 2008) A relatively easy to use C++ game engine with Python bindings that was made by Disney and is owned by Carnegie Mellon University. Disney uses it to produce some of their games.
* Platinum Arts Sandbox Free 3D Game Maker — Open source and based on the Cube 2: Sauerbraten engine with a focus on game creation and designed for kids and adults. The program includes Non commercial content, but the main engine and large majority of the media can be used commercially. The Exciting Adventures of Master Chef Ogro was created using this engine by High School students.
* TinyMUCK — MU* engine
* TinyMUD — MU* engine
* Unity — An open-ended 3D game/interactive software engine for web, Windows, and Mac OS X. Upgrading to paid licenses can additionally enable support for the iPhone, Android and Nintendo Wii.
* World Builder — A classic Mac OS game engine.
* Wintermute Engine — A runtime and development tools for creating 2D and 2.5D point'n'click adventure games.[6][7]
* RGSS — An engine made by Enterbrain to create RPGs using RPG Maker XP. RGSS2 was used for RPG Maker VX.
Commercial engines
* Alamo — the engine used in Star Wars: Empire at War by Petroglyph Games.
* Aurora Engine — For Role-playing games.
* Bork3D Game Engine — A cross-platform game engine primarily targeting iPhone and iPad.
* BigWorld — Server, client and development tools for the development of MMOG for games that run on Windows, Xbox 360, and PS3.
* BRender — A real-time 3D graphics engine for computer games, simulators and graphic tools.
* C4 Engine — A cross-platform game engine developed by Terathon Software.
* Cafu Engine — A game engine with development tools for creating multiplayer, cross-platform, real-time 3D games and applications.
* Coldstone game engine — An old game creation suite for Macintosh/Windows to create role-playing or adventure-style games.
* Corona SDK — A cross-platform, Lua-based game engine that can build games to the iPhone, iPad, or Android devices from the same set of code.
* CPAL3D — Complete game creation tools with scene editor, IDE and text server.
* CryEngine, CryEngine 2, CryEngine 3, CryEngine 3.5 — The game engine used for the first-person shooter computer game Far Cry. CryEngine 2 is a new generation engine developed by Crytek to create the FPS game Crysis.
* Crystal Tools — Square Enix's proprietary seventh generation game engine.
* DX Studio — Engine and editing suite that allows creation of real-time games and simulations.
* Dunia Engine — Engine (heavily modified version of the CryEngine) made especially for Far Cry 2 by Ubisoft Montreal.
* Earth-4 Engine — The graphics engine used in Earth 2160
* Electron engine — Developed by Obsidian Entertainment for their game Neverwinter Nights 2, based on the Aurora engine.
* Elflight Engine — Cross-platform 3D streaming game engine designed from the ground up for use over the Web. Games can play in a web browser window, in a separate window or full-screen. Java and OpenGL based.
* Enigma Engine — A real-time tactics game engine, used in Blitzkrieg.
* Esperient Creator — A very powerful 3D modeler and engine, used world wide for training, simulation, architecture, and games. Built-in Scripting, C/C++, CScript, or Lisp, Shader Editor, import 50+ 3D formats.
* Euphoria — This is a biomechanical Ragdoll engine by NaturalMotion.
* Freescape (1986) — Incentive Software; One of the first proprietary 3D game engines, used in Driller and 3D Construction Kit.
* Frostbite Engine — Game engine used for the next-gen title Battlefield: Bad Company.
* Gamebryo — Cross-platform game middleware for professional developers, notable for its rapid development.
* GameSalad — A 2D game engine that currently targets the iPhone and a Apple Safari Web-plugin developed by Gendai Games. Has a visual programming interface that allows for rapid development.
* Gamestudio — A 2D and 3D game engine for beginners. Uses the Gamestudio development system and the lite-C programming language.
* Glacier, Glacier2 — Developed by IO Interactive and used for the Hitman series of games. Glacier2 is a new generation engine currently in development for upcoming games.[8]
* Gogii Games Engine - a 2d multi-platform C++ engine supporting PC, Mac, iPhone and iPad. Used in casual games such as the Mortimer Beckett series.
* GrimE — Used in LucasArts graphical adventure games starting with Grim Fandango.
* Hedgehog Engine — Created by the Sonic Team with the capability of rendering high quality graphics at high speed. It was first used in Sonic Unleashed.
* HeroEngine — 3D game engine by Simutronics for building MMOs in a live collaborative environment.
* HPL Engine 2 — Used in Frictional Games survival horror games. Earlier versions are free software.
* id Tech 4 — (Also known as Doom 3 engine) Used by the games Doom 3, Quake 4, Prey and Quake Wars. Will become Open Source with the release of RAGE in September 2011[9].
* id Tech 5 — Currently in development by id Software as engine for their games, Doom 4 and Rage, and as a general purpose engine to be licensed.
* IMUSE — Specifically designed to synchronize music with visual action.
* Infernal Engine — Created by Terminal Reality, provides rendering, physics, sound, AI, and metrics for game development. Used in several games such as Ghostbusters: The Video Game, Mushroom Men: The Spore Wars, Bass Pro Shops: The Strike and Roogoo: Twisted Towers.[10]
* INSANE — Used in LucasArts games.
* Infinity Engine — Allows the creation of isometric computer role-playing games.
* Jade engine — Developed by Ubisoft, originally for Beyond Good & Evil.
* Jedi — A game engine developed by LucasArts for Star Wars: Dark Forces and Outlaws.
* K2 Engine — An engine used in Heroes of Newerth and Savage2 by S2 Games.
* Kaneva Game Platform — A MMOG engine for independent and professional game development.
* Kinetica — A game engine developed by Sony for PlayStation 2.
* Leadwerks Engine — Leadwerks Engine is a 3D engine for rendering, sound, and physics in real-time games and simulations.
* Lemon Engine — Lemon Engine is a modular set of libraries for all aspects of game development across all major platforms.
* Lithtech Jupiter Ex — Developed by Monolith Productions to create the game F.E.A.R.
* LyN engine — Developed by Ubisoft, originally for Rabbids Go Home and Beyond Good & Evil 2.
* Medusa — A C++ 3D game engine developed by Palestar and used in the DarkSpace MMO. It features distributed world simulation, single tool version control and asset realisation, cross-platform compatibility and an integrated client/server network system.
* Monumental Technology Suite – A MMOG platform, including server and client technology and development / live management tools.
* MT Framework — Game engine created by Capcom and used for their games on Xbox 360, PlayStation 3 and PC.
* Multimedia Fusion 2 — A 2D game development system.
* Multiverse Network — An MMOG platform, including server, client, and tools. (Free for development and use — revenue sharing upon commercial deployment).
* Odyssey Engine — Used to create three dimensional computer role-playing games, used in Star Wars: Knights of the Old Republic
* Onyx Engine — Developed by Ubisoft
* Pie in the Sky — Used in two internal games by Pie in the Sky Software and then in the 3D Game Creation System and the games made with it.
* PhyreEngine — A cross platform (PC & PS3) graphics engine from Sony Computer Entertainment.
* Q (game engine) — A fully pluggable, extensible and customisable framework and tools from Qube Software for PC, Wii, PS2, PS3, Xbox, Xbox 360, PSP, iPhone etc. created by the team behind Direct3D.
* RAGE — A game engine created by Rockstar Games to power their upcoming video games on the Xbox 360 and PlayStation 3. Implemented in Grand Theft Auto 4.
* RelentENGINE — A next-generation FPS engine supporting massive destroyable city environments and realistic vehicle control, makes extensive use of shader model 3.
* RenderWare — A 3D API and graphics rendering engine.
* Revolution3D — A 3D graphics engine developed by X-Dream Project.
* RPG Maker VX — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* RPG Maker XP — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* RPG Maker 2003 — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* RPG Maker 95 — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* SAGE engine — Used to create real-time strategy games.
* Scaleform — A vector graphics rendering engine used to display Adobe Flash-based user interfaces, HUDs, and animated textures for games in PC, Mac, Linux, Xbox 360, PlayStation 2, PlayStation Portable, PlayStation 3, and Wii.
* SCUMM engine — Used in LucasArts graphical adventure games.
* Serious Engine — The engine by Croteam used in the epic Serious Sam: The First Encounter and The Second Encounter.
* Shark 3D — A middleware from Spinor for computer, video games and realtime 3D applications.
* ShiVa — A game engine with an authoring tool to produce 3d real time applications for Windows, Mac OS X, Linux, WebOS, Android, and iPhone.
* Silent Storm engine — A turn-based tactics/tactical RPG game engine, used in Silent Storm.
* Sith — A game engine developed by LucasArts for Jedi Knight: Dark Forces II.
* Source engine — A game engine developed by Valve Software for Half-Life 2.The SDK comes with Half Life 2
* Torque Game Engine — A modified version of a 3D computer game engine originally developed by Dynamix for the 2001 FPS Tribes 2.
* Torque Game Engine Advanced – A next-generation 3D game engine support modern GPU hardware and shaders.
* TOSHI — A fourth generation cross platform game engine designed by Blue Tongue Entertainment.
* Truevision3D — A 3D game engine using the DirectX API.
* Unigine — Cross-platform middleware engine.
* Unity — An open-ended 3D game/interactive software engine for web, Windows, Mac OS X, iOS (iPod, iPhone, and iPad), Android, and Nintendo Wii.
* Unreal Engine — A game engine for PC, Xbox 360 and PlayStation 3 .
* Vengeance engine — A video game engine based on the Unreal Engine 2/2.5
* Vicious Engine — Available for Microsoft Windows, Sony PlayStation 2, Microsoft Xbox, and Sony PlayStation Portable
* Virtools — A 3D engine combined with high-level development framework, used for game prototyping and rapid developments. Available for Windows, Macintosh, Xbox, PSP. Can publish standalone or for the 3DVia Web Player browser plugin.
* Vision Engine 8 — A cross-platform game engine, developed by Trinigy. Used in games such as: Arcania: A Gothic Tale, The Settlers 7: Paths to a Kingdom, Dungeon Hero, Cutthroat, and Three Investigators.
* Visual3D.NET Game Engine — All-in-One 3D game engine and toolset, fully written in C#/.NET for Windows A browser player is roadmapped for v1.1.
* WGAF — The game engine developed by Guild Software which powers their MMORPG Vendetta Online.
* X-Ray — The game engine developed by GSC Game World which powers their FPS series, "S.T.A.L.K.E.R".
* XnGine — Developed by Bethesda Softworks, one of the first true 3D engines.
* Zillions of Games — used to develop games that happen on a grid, like chess
These engines are available for free use, but without the source code being available under an open source license. Many of these engines are commercial products which have a free edition available for them:
* Adventure Game Studio — Mainly used to develop third-person pre-rendered adventure games, this engine is one of the most popular for developing amateur adventure games.
* Cocos2d — A 2D game engine for making iPhone games.
* DikuMUD and derivatives — MUD engines
* dim3 — Freeware 3D JavaScript engine for the Mac (although finished games are cross-platform).
* DX Studio — A freeware 3D game engine with complete tools for 3D video game development. Upgrading to paid licenses would unlock extra features.
* Game Maker Lite — Object-oriented game development software with a scripting language as well as a drag-and-drop interface.
* LPMud and derivatives (including MudOS and FluffOS) — MUD engines
* MUSH — MU* engine
* M.U.G.E.N — A 2D fighting game engine.
* Open Scene Graph — An open source 3D graphics toolkit, used by application developers in fields such as visual simulation, computer games, virtual reality, scientific visualization and modelling.
* Panda3D — (Releases prior to May 28, 2008) A relatively easy to use C++ game engine with Python bindings that was made by Disney and is owned by Carnegie Mellon University. Disney uses it to produce some of their games.
* Platinum Arts Sandbox Free 3D Game Maker — Open source and based on the Cube 2: Sauerbraten engine with a focus on game creation and designed for kids and adults. The program includes non-commercial content, but the main engine and the large majority of the media can be used commercially. The Exciting Adventures of Master Chef Ogro was created using this engine by high school students.
* TinyMUCK — MU* engine
* TinyMUD — MU* engine
* Unity — An open-ended 3D game/interactive software engine for web, Windows, and Mac OS X. Upgrading to paid licenses can additionally enable support for the iPhone, Android and Nintendo Wii.
* World Builder — A classic Mac OS game engine.
* Wintermute Engine — A runtime and development tools for creating 2D and 2.5D point'n'click adventure games.[6][7]
* RGSS — An engine made by Enterbrain to create RPGs using RPG Maker XP. RGSS2 was used for RPG Maker VX.
[edit] Commercial engines
* Alamo — the engine used in Star Wars: Empire at War by Petroglyph Games.
* Aurora Engine — For role-playing games.
* Bork3D Game Engine — A cross-platform game engine primarily targeting iPhone and iPad.
* BigWorld — Server, client and development tools for the development of MMOG for games that run on Windows, Xbox 360, and PS3.
* BRender — A real-time 3D graphics engine for computer games, simulators and graphic tools.
* C4 Engine — A cross-platform game engine developed by Terathon Software.
* Cafu Engine — A game engine with development tools for creating multiplayer, cross-platform, real-time 3D games and applications.
* Coldstone game engine — An old game creation suite for Macintosh/Windows to create role-playing or adventure-style games.
* Corona SDK — A cross-platform, Lua-based game engine that can build games to the iPhone, iPad, or Android devices from the same set of code.
* CPAL3D — Complete game creation tools with scene editor, IDE and text server.
* CryEngine, CryEngine 2, CryEngine 3, CryEngine 3.5 — The game engine used for the first-person shooter computer game Far Cry. CryEngine 2 is a new generation engine developed by Crytek to create the FPS game Crysis.
* Crystal Tools — Square Enix's proprietary seventh generation game engine.
* DX Studio — Engine and editing suite that allows creation of real-time games and simulations.
* Dunia Engine — Engine (heavily modified version of the CryEngine) made especially for Far Cry 2 by Ubisoft Montreal.
* Earth-4 Engine — The graphics engine used in Earth 2160.
* Electron engine — Developed by Obsidian Entertainment for their game Neverwinter Nights 2, based on the Aurora engine.
* Elflight Engine — Cross-platform 3D streaming game engine designed from the ground up for use over the Web. Games can play in a web browser window, in a separate window or full-screen. Java and OpenGL based.
* Enigma Engine — A real-time tactics game engine, used in Blitzkrieg.
* Esperient Creator — A very powerful 3D modeler and engine, used world wide for training, simulation, architecture, and games. Built-in Scripting, C/C++, CScript, or Lisp, Shader Editor, import 50+ 3D formats.
* Euphoria — This is a biomechanical Ragdoll engine by NaturalMotion.
* Freescape (1986) — Incentive Software; One of the first proprietary 3D game engines, used in Driller and 3D Construction Kit.
* Frostbite Engine — Game engine used for the next-gen title Battlefield: Bad Company.
* Gamebryo — Cross-platform game middleware for professional developers, notable for its rapid development.
* GameSalad — A 2D game engine, developed by Gendai Games, that currently targets the iPhone and an Apple Safari Web plug-in. Has a visual programming interface that allows for rapid development.
* Gamestudio — A 2D and 3D game engine for beginners. Uses the Gamestudio development system and the lite-C programming language.
* Glacier, Glacier2 — Developed by IO Interactive and used for the Hitman series of games. Glacier2 is a new generation engine currently in development for upcoming games.[8]
* Gogii Games Engine — A 2D multi-platform C++ engine supporting PC, Mac, iPhone and iPad. Used in casual games such as the Mortimer Beckett series.
* GrimE — Used in LucasArts graphical adventure games starting with Grim Fandango.
* Hedgehog Engine — Created by the Sonic Team with the capability of rendering high quality graphics at high speed. It was first used in Sonic Unleashed.
* HeroEngine — 3D game engine by Simutronics for building MMOs in a live collaborative environment.
* HPL Engine 2 — Used in Frictional Games survival horror games. Earlier versions are free software.
* id Tech 4 — (Also known as Doom 3 engine) Used by the games Doom 3, Quake 4, Prey and Quake Wars. Will become Open Source with the release of RAGE in September 2011[9].
* id Tech 5 — Currently in development by id Software as engine for their games, Doom 4 and Rage, and as a general purpose engine to be licensed.
* IMUSE — Specifically designed to synchronize music with visual action.
* Infernal Engine — Created by Terminal Reality, provides rendering, physics, sound, AI, and metrics for game development. Used in several games such as Ghostbusters: The Video Game, Mushroom Men: The Spore Wars, Bass Pro Shops: The Strike and Roogoo: Twisted Towers.[10]
* INSANE — Used in LucasArts games.
* Infinity Engine — Allows the creation of isometric computer role-playing games.
* Jade engine — Developed by Ubisoft, originally for Beyond Good & Evil.
* Jedi — A game engine developed by LucasArts for Star Wars: Dark Forces and Outlaws.
* K2 Engine — An engine used in Heroes of Newerth and Savage2 by S2 Games.
* Kaneva Game Platform — A MMOG engine for independent and professional game development.
* Kinetica — A game engine developed by Sony for PlayStation 2.
* Leadwerks Engine — A 3D engine for rendering, sound, and physics in real-time games and simulations.
* Lemon Engine — A modular set of libraries covering all aspects of game development across all major platforms.
* Lithtech Jupiter Ex — Developed by Monolith Productions to create the game F.E.A.R.
* LyN engine — Developed by Ubisoft, originally for Rabbids Go Home and Beyond Good & Evil 2.
* Medusa — A C++ 3D game engine developed by Palestar and used in the DarkSpace MMO. It features distributed world simulation, single tool version control and asset realisation, cross-platform compatibility and an integrated client/server network system.
* Monumental Technology Suite – A MMOG platform, including server and client technology and development / live management tools.
* MT Framework — Game engine created by Capcom and used for their games on Xbox 360, PlayStation 3 and PC.
* Multimedia Fusion 2 — A 2D game development system.
* Multiverse Network — An MMOG platform, including server, client, and tools. (Free for development and use — revenue sharing upon commercial deployment).
* Odyssey Engine — Used to create three-dimensional computer role-playing games; used in Star Wars: Knights of the Old Republic.
* Onyx Engine — Developed by Ubisoft.
* Pie in the Sky — Used in two internal games by Pie in the Sky Software and then in the 3D Game Creation System and the games made with it.
* PhyreEngine — A cross platform (PC & PS3) graphics engine from Sony Computer Entertainment.
* Q (game engine) — A fully pluggable, extensible and customisable framework and tools from Qube Software for PC, Wii, PS2, PS3, Xbox, Xbox 360, PSP, iPhone etc. created by the team behind Direct3D.
* RAGE — A game engine created by Rockstar Games to power its video games on the Xbox 360 and PlayStation 3. First used in Grand Theft Auto IV.
* RelentENGINE — A next-generation FPS engine supporting massive destroyable city environments and realistic vehicle control, makes extensive use of shader model 3.
* RenderWare — A 3D API and graphics rendering engine.
* Revolution3D — A 3D graphics engine developed by X-Dream Project.
* RPG Maker VX — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* RPG Maker XP — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* RPG Maker 2003 — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* RPG Maker 95 — A 2D engine to make top-down and isometric-style role-playing games for Windows.
* SAGE engine — Used to create real-time strategy games.
* Scaleform — A vector graphics rendering engine used to display Adobe Flash-based user interfaces, HUDs, and animated textures for games in PC, Mac, Linux, Xbox 360, PlayStation 2, PlayStation Portable, PlayStation 3, and Wii.
* SCUMM engine — Used in LucasArts graphical adventure games.
* Serious Engine — The engine by Croteam used in the epic Serious Sam: The First Encounter and The Second Encounter.
* Shark 3D — A middleware from Spinor for computer, video games and realtime 3D applications.
* ShiVa — A game engine with an authoring tool to produce 3d real time applications for Windows, Mac OS X, Linux, WebOS, Android, and iPhone.
* Silent Storm engine — A turn-based tactics/tactical RPG game engine, used in Silent Storm.
* Sith — A game engine developed by LucasArts for Jedi Knight: Dark Forces II.
* Source engine — A game engine developed by Valve Software for Half-Life 2. The SDK comes with Half-Life 2.
* Torque Game Engine — A modified version of a 3D computer game engine originally developed by Dynamix for the 2001 FPS Tribes 2.
* Torque Game Engine Advanced — A next-generation 3D game engine supporting modern GPU hardware and shaders.
* TOSHI — A fourth generation cross platform game engine designed by Blue Tongue Entertainment.
* Truevision3D — A 3D game engine using the DirectX API.
* Unigine — Cross-platform middleware engine.
* Unity — An open-ended 3D game/interactive software engine for web, Windows, Mac OS X, iOS (iPod, iPhone, and iPad), Android, and Nintendo Wii.
* Unreal Engine — A game engine for PC, Xbox 360 and PlayStation 3.
* Vengeance engine — A video game engine based on the Unreal Engine 2/2.5
* Vicious Engine — Available for Microsoft Windows, Sony PlayStation 2, Microsoft Xbox, and Sony PlayStation Portable
* Virtools — A 3D engine combined with high-level development framework, used for game prototyping and rapid developments. Available for Windows, Macintosh, Xbox, PSP. Can publish standalone or for the 3DVia Web Player browser plugin.
* Vision Engine 8 — A cross-platform game engine, developed by Trinigy. Used in games such as: Arcania: A Gothic Tale, The Settlers 7: Paths to a Kingdom, Dungeon Hero, Cutthroat, and Three Investigators.
* Visual3D.NET Game Engine — All-in-one 3D game engine and toolset, fully written in C#/.NET for Windows. A browser player is roadmapped for v1.1.
* WGAF — The game engine developed by Guild Software which powers their MMORPG Vendetta Online.
* X-Ray — The game engine developed by GSC Game World which powers their FPS series, "S.T.A.L.K.E.R".
* XnGine — Developed by Bethesda Softworks, one of the first true 3D engines.
* Zillions of Games — Used to develop games played on a grid, like chess.
Wednesday, November 3, 2010
Develop iPhone Game with Tilemap
Here are some good videos about tile map editors and game development that I like most.
Tuesday, November 2, 2010
iPhone Map Kit - Tutorial and Code
I tried looking for online resources that could be of some help but did not find any good tutorial that explains how an address can be shown on a map within an application. Therefore, I decided to write one, and here it is. Hope it will be of some help.
Let's create a simple application which displays the address entered by the user on a map within the application. We'll call it MapApp.
1. First, create a Window-based application and name the project MapApp.
2. Add the MapKit framework to the project. (Control + Click Frameworks folder -> Add -> Existing Frameworks)
3. Create a new view controller class and call it MapViewController. Add a text field, button and map view to it.
#import <UIKit/UIKit.h>
#import <MapKit/MapKit.h>
@class AddressAnnotation;
@interface MapViewController : UIViewController <MKMapViewDelegate> {
IBOutlet UITextField *addressField;
IBOutlet UIButton *goButton;
IBOutlet MKMapView *mapView;
AddressAnnotation *addAnnotation;
}
- (IBAction)showAddress;
@end
4. Now create a xib file named MapView.xib. Set its type to MapViewController and add a UITextField, UIButton and MKMapView to it.
Make sure you set the delegate for the mapView to the controller class.
5. Once the view is ready, update the MapAppDelegate so that the view controller and the view is loaded.
- (void)applicationDidFinishLaunching:(UIApplication *)application {
mapViewController = [[MapViewController alloc] initWithNibName:@"MapView" bundle:nil];
[window addSubview:mapViewController.view];
[window makeKeyAndVisible];
}
6. Now, build the app and check if the view appears correctly or not. We now have the UI ready for entering the address and button for updating the location in the map.
7. Add the class for showing the annotation on the location. Let's call this class AddressAnnotation.
@interface AddressAnnotation : NSObject <MKAnnotation> {
CLLocationCoordinate2D coordinate;
NSString *mTitle;
NSString *mSubTitle;
}
@property (nonatomic, readonly) CLLocationCoordinate2D coordinate;
- (id)initWithCoordinate:(CLLocationCoordinate2D)c;
@end
@implementation AddressAnnotation
@synthesize coordinate;
- (NSString *)subtitle{
return @"Sub Title";
}
- (NSString *)title{
return @"Title";
}
- (id)initWithCoordinate:(CLLocationCoordinate2D)c {
if ((self = [super init])) {
coordinate = c;
NSLog(@"%f,%f", c.latitude, c.longitude);
}
return self;
}
@end
This class will basically show the title and the subtitle of the location on the map.
8. Let's add the method that is called when the 'Go' button is tapped; it contains the code that actually displays the address location on the map. We call that action showAddress.
- (IBAction) showAddress {
//Hide the keypad
[addressField resignFirstResponder];
MKCoordinateRegion region;
MKCoordinateSpan span;
span.latitudeDelta=0.2;
span.longitudeDelta=0.2;
CLLocationCoordinate2D location = [self addressLocation];
region.span=span;
region.center=location;
if(addAnnotation != nil) {
[mapView removeAnnotation:addAnnotation];
[addAnnotation release];
addAnnotation = nil;
}
addAnnotation = [[AddressAnnotation alloc] initWithCoordinate:location];
[mapView addAnnotation:addAnnotation];
[mapView setRegion:region animated:TRUE];
[mapView regionThatFits:region];
}
9. The map view shows a location based on its latitude and longitude, but we have the address in textual form. Therefore we need to convert it into a CLLocationCoordinate2D. Note that in the above code we call the method named addressLocation to perform this conversion.
-(CLLocationCoordinate2D) addressLocation {
NSString *urlString = [NSString stringWithFormat:@"http://maps.google.com/maps/geo?q=%@&output=csv",
[addressField.text stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];
NSString *locationString = [NSString stringWithContentsOfURL:[NSURL URLWithString:urlString]];
NSArray *listItems = [locationString componentsSeparatedByString:@","];
double latitude = 0.0;
double longitude = 0.0;
if([listItems count] >= 4 && [[listItems objectAtIndex:0] isEqualToString:@"200"]) {
latitude = [[listItems objectAtIndex:2] doubleValue];
longitude = [[listItems objectAtIndex:3] doubleValue];
}
else {
//Show error
}
CLLocationCoordinate2D location;
location.latitude = latitude;
location.longitude = longitude;
return location;
}
The above code reads the address entered in the input box and gets the location from maps.google.com in CSV format. It then gets the latitude and longitude from it. The return code of 200 from google means success.
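The parsing step above is simple enough to sketch outside Objective-C. Here is a minimal Python illustration of the same logic, assuming the geocoder returns a CSV line of the form status,accuracy,latitude,longitude (the sample coordinates are made up for illustration):

```python
def parse_geocoder_csv(response):
    """Parse a geocoder CSV response "status,accuracy,lat,lng"
    into a (lat, lng) tuple, or None unless status is 200."""
    items = response.split(",")
    if len(items) >= 4 and items[0] == "200":
        return float(items[2]), float(items[3])
    return None

# A successful lookup returns the coordinate pair:
print(parse_geocoder_csv("200,8,37.4219999,-122.0840575"))
# → (37.4219999, -122.0840575)
# A failed lookup (non-200 status) yields None:
print(parse_geocoder_csv("602,0,0,0"))
# → None
```

The Objective-C version mirrors this exactly: split on commas, check that the first field is "200", then take fields 2 and 3 as latitude and longitude.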
10. Finally, let's add the delegate method that will display the annotation on the map.
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>)annotation {
MKPinAnnotationView *annView = [[[MKPinAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:@"currentloc"] autorelease];
annView.pinColor = MKPinAnnotationColorGreen;
annView.animatesDrop=TRUE;
annView.canShowCallout = YES;
annView.calloutOffset = CGPointMake(-5, 5);
return annView;
}
This method basically creates an annotation view (a green pin) for the annotation that we added earlier to the MapView. Tapping on the green pin will display the title and the subtitle.
So this was a very simple example of how a map can be shown from within an application. Hope this was helpful. Let me know your comments/feedback. Click here to download the code.
UPDATE: Until now, Google was not requiring an API key in the URL - http://maps.google.com/maps/geo?q=address&output=csv
The URL now needs to change to – http://maps.google.com/maps/geo?q=address&output=csv&key=YourGoogleMapsAPIKey
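Building the keyed URL can be sketched as follows; this is a hedged Python illustration, and YourGoogleMapsAPIKey is of course a placeholder, not a real key. Note that the address must be percent-encoded, which is what stringByAddingPercentEscapesUsingEncoding: does in the Objective-C code above:

```python
from urllib.parse import quote

def geocoder_url(address, api_key):
    """Build the keyed CSV geocoder URL described above.
    api_key is a placeholder value for illustration."""
    return ("http://maps.google.com/maps/geo?q=%s&output=csv&key=%s"
            % (quote(address), api_key))

print(geocoder_url("1600 Amphitheatre Parkway", "YourGoogleMapsAPIKey"))
# → http://maps.google.com/maps/geo?q=1600%20Amphitheatre%20Parkway&output=csv&key=YourGoogleMapsAPIKey
```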
Monday, November 1, 2010
Touch Detection in cocos2d iPhone Example
First, you need to set self.isTouchEnabled = YES; in the init method of your layer.
The three approaches are:
1. Dumb input management. This isn't dumb in the sense of stupid; it's dumb in the sense of a dumb missile that keeps flying straight until it hits something. A more precise description would be "ignorant of global state."
While usually not usable as-is in non-demo applications, this approach underpins the other two approaches, and is thus important.
Simply subclass CocosNode and implement any or all of these three methods (you don't have to define them in the interface, they're already defined by a superclass).
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    [self doWhateverYouWantToDo];
    [self doItWithATouch:touch];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    [self doWhateverYouWantToDo];
    [self doItWithATouch:touch];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    [self doWhateverYouWantToDo];
    [self doItWithATouch:touch];
}
The distinction between the three methods: touchesBegan fires when the user first presses a finger on the screen, touchesMoved fires as the finger moves across the screen (before it is lifted), and touchesEnded fires when the finger is lifted.
Using these three methods, you can easily fire actions whenever a Sprite (or any other Cocos2d subclass) is touched. For a simple application that may be sufficient.
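As a language-neutral illustration of the three touch phases, here is a small C sketch of a drag tracker; the `Drag` struct and the function names are hypothetical stand-ins for the began/moved/ended callbacks:

```c
/* A minimal drag tracker mirroring the began/moved/ended callbacks. */
typedef struct {
    float start_x, start_y;  /* where the touch began */
    float x, y;              /* latest known position */
    int active;              /* 1 while a touch is in progress */
} Drag;

void touch_began(Drag *d, float x, float y) {
    d->start_x = d->x = x;
    d->start_y = d->y = y;
    d->active = 1;
}

void touch_moved(Drag *d, float x, float y) {
    if (d->active) { d->x = x; d->y = y; }
}

/* Returns the total horizontal delta of the completed drag. */
float touch_ended(Drag *d) {
    d->active = 0;
    return d->x - d->start_x;
}
```

A began at (10, 10) followed by a move to (25, 12) and an end yields a horizontal delta of 15, which is the kind of value you would feed into a swipe or pan action.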
2. Top-down global input management. The next approach allows a very high level of control over handling input, but is prone to creating a monolithic method that handles all input management for your application.
First, it requires that you have references to all Sprite objects you are interested in detecting input for. You can manage the references manually, or you can set up the subclass to track all of its instances.
You can track instance references fairly easily, modeling after this code:
@interface MySprite : Sprite {}
+(NSMutableArray *)allMySprites;
+(void)track: (MySprite *)aSprite;
+(void)untrack: (MySprite *)aSprite;
@end
And the implementation:
@implementation MySprite
static NSMutableArray * allMySprites = nil;

+(NSMutableArray *)allMySprites {
    // synchronize on the class, not on the (possibly still nil) array
    @synchronized(self) {
        if (allMySprites == nil)
            allMySprites = [[NSMutableArray alloc] init];
        return allMySprites;
    }
}
+(void)track: (MySprite *)aSprite {
    @synchronized(self) {
        [[MySprite allMySprites] addObject:aSprite];
    }
}
+(void)untrack: (MySprite *)aSprite {
    @synchronized(self) {
        [[MySprite allMySprites] removeObject:aSprite];
    }
}
-(id)init {
    self = [super init];
    if (self) [MySprite track:self];
    return self;
}
-(void)dealloc {
    // note: the array retains each sprite, so call untrack: explicitly
    // when removing a sprite from the scene, or dealloc will never run
    [MySprite untrack:self];
    [super dealloc];
}
@end
So, maybe this is a bit of a pain to set up, but it can be pretty useful in other situations as well (like discovering which instances of MySprite are within a certain distance of a point).
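That proximity query reduces to a squared-distance test over the tracked list. Here is a C sketch of the idea; `Pt`, `within`, and `count_near` are hypothetical names, not part of cocos2d:

```c
#include <stddef.h>

/* A 2D point, standing in for a sprite's position. */
typedef struct { float x, y; } Pt;

/* True if a and b are within `radius` of each other.
 * Comparing squared distances avoids a sqrt call. */
int within(Pt a, Pt b, float radius) {
    float dx = a.x - b.x, dy = a.y - b.y;
    return dx * dx + dy * dy <= radius * radius;
}

/* Count how many tracked points fall inside the radius,
 * mirroring a scan over [MySprite allMySprites]. */
size_t count_near(const Pt *all, size_t n, Pt center, float radius) {
    size_t i, hits = 0;
    for (i = 0; i < n; i++)
        if (within(all[i], center, radius))
            hits++;
    return hits;
}
```

With points at (0, 0), (3, 4), and (10, 10), a radius-5 query around the origin matches the first two (the second sits exactly on the boundary) and skips the third.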
Then, you implement the three methods from above in your Scene object and use them to handle and route touches.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    NSArray * mySprites = [MySprite allMySprites];
    NSUInteger i, count = [mySprites count];
    for (i = 0; i < count; i++) {
        MySprite * obj = (MySprite *)[mySprites objectAtIndex:i];
        if (CGRectContainsPoint([obj rect], location)) {
            // code here is only executed if obj has been touched
        }
    }
}
The advantage of this approach is that you have an extremely granular level of control over input management. If you only wanted to perform actions on touches that touch two instances of MySprite, you could do that. Or you could only perform actions when a certain global condition is activated, and so on. This approach lets you make decisions at the point in your application that has the most information.
But it can get unwieldy depending on the type of logic you want to implement for your user input management. To help control that, I usually roll a simple system for user input modes.
The implementation depends on your specific app, but you'd start by subclassing NSObject into a UIMode object.
@interface UIMode : NSObject {}
-(id)init;
-(void)setupWithObject: (id)anObject;
-(void)tearDown: (UIMode *)nextMode;
-(void)tick: (ccTime)dt;
-(BOOL)touchBeganAt: (CGPoint)aPoint;
-(BOOL)touchMovedAt: (CGPoint)aPoint;
-(BOOL)touchEndedAt: (CGPoint)aPoint;
@end
The implementations of all those methods in UIMode should be inert stubs that subclasses override as appropriate. My system is to have the touchBeganAt:/touchMovedAt:/touchEndedAt: methods return YES if they decide to handle a specific touch, and NO otherwise. This lets user interface modes implement custom logic, or let a touch pass on to your default touch handling.
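That handled/not-handled chain can be sketched in C with a function pointer standing in for the UIMode; all names here (`TouchHandler`, `dispatch_touch`, the example modes) are hypothetical:

```c
#include <stddef.h>

/* A mode gets first refusal on each touch; returning 1 means "handled". */
typedef int (*TouchHandler)(float x, float y);

static int handled_by_default = 0;

/* Fallback handling, run only when no mode consumes the touch. */
int default_handler(float x, float y) {
    (void)x; (void)y;
    handled_by_default = 1;  /* record that the fallback ran */
    return 1;
}

int dispatch_touch(TouchHandler mode, float x, float y) {
    if (mode != NULL && mode(x, y))
        return 1;                  /* the mode consumed the touch */
    return default_handler(x, y);  /* fall through to default handling */
}

/* Two example modes: one that consumes touches, one that declines them. */
int mode_consume(float x, float y) { (void)x; (void)y; return 1; }
int mode_decline(float x, float y) { (void)x; (void)y; return 0; }
```

Dispatching through `mode_consume` never reaches the default handler, while `mode_decline` lets the touch fall through, matching the YES/NO contract described above.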
Next update the interface for your subclass of Scene like this:
@interface MyScene : Scene {
    UIMode * currentMode;
}
-(UIMode *)currentMode;
-(void)setCurrentMode: (UIMode *)aMode;
@end
Then, in your implementation you'd add some code along these lines:
-(UIMode *)currentMode {
    return currentMode;
}
-(void)setCurrentMode: (UIMode *)aMode {
    [aMode retain];  // retain first, in case aMode == currentMode
    if (currentMode != nil) {
        // tearDown: is part of the imagined UIMode class; it lets a
        // UIMode disable itself with knowledge of the subsequent
        // UIMode, for proper transitions between modes
        [currentMode tearDown:aMode];
        [currentMode release];
    }
    currentMode = aMode;
}
Finally, you'd need to update the touchesBegan:withEvent: method to ask the current UIMode whether it wants to handle each specific touch.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    // forward the specified location to the UIMode, and abort
    // standard touch handling if the UIMode decides to handle it
    UIMode * uim = [self currentMode];
    if (uim != nil && [uim touchBeganAt:location] == YES) return;
    NSArray * mySprites = [MySprite allMySprites];
    NSUInteger i, count = [mySprites count];
    for (i = 0; i < count; i++) {
        MySprite * obj = (MySprite *)[mySprites objectAtIndex:i];
        if (CGRectContainsPoint([obj rect], location)) {
            // code here is only executed if obj has been touched
        }
    }
}
This is the approach I prefer, because it is fairly simple, and allows an extremely high amount of flexibility. I realize that I dumped a ton of code here, and apologize. Hopefully you can still find the thread of thought intertwined into the jumble.
3. Bottom-up global input management. I won't provide much code for this approach, as it isn't one that I use, but it's a compromise between the first and second approaches.
For each instance of some MySprite class, override the touchesBegan:withEvent: method (and the moved and ended variants as well, if you want them), and then notify a global object that the touch occurred.
It would look something like this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CurrentScene * s = [self currentScene]; // Not a real method.
    [s mySpriteTouched:self];
}
Of course, this means you'd need to pass a reference to the current scene to each instance of MySprite, or you can use a singleton to simplify.
static CurrentScene *sharedScene = nil;

+(CurrentScene *)sharedScene {
    @synchronized(self) {
        if (sharedScene == nil)
            [[self alloc] init];  // allocWithZone: assigns sharedScene
    }
    return sharedScene;
}
+(void)releaseSharedScene {
    @synchronized(self) {
        if (sharedScene != nil) [sharedScene release];
        sharedScene = nil;
    }
}
+(id)allocWithZone: (NSZone *)zone {
    @synchronized(self) {
        if (sharedScene == nil) {
            sharedScene = [super allocWithZone:zone];
            return sharedScene;
        }
    }
    return nil;
}
-(id)retain {
    return self;
}
-(unsigned)retainCount {
    return UINT_MAX;  // denotes an object that cannot be released
}
-(void)release {}
-(id)autorelease {
    return self;
}
The code is a bit of a clusterfuck, in my humble opinion, but it is still quite convenient, as it allows us to convert the touchesBegan:withEvent: method to this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [[CurrentScene sharedScene] mySpriteTouched:self];
}
And we don't have to explicitly pass the reference to the CurrentScene instance to each instance of MySprite. Objective-C has a lot of these painful pieces of code that are rather annoying to implement, but can save a lot of effort once they are implemented. My advice is to use them, early and infrequently.
Well, there you have it, three approaches to handling touch detection for Cocos2d iPhone, presented in a confusing and at most halfway organized article.