/* Code Musings */

Working with the Blackberry Platform Services (BPS), Part 2

The last post in this series talked about setting up BPS and getting an event loop running. In this post, I'll detail how screen event handling might work, along with any miscellaneous bits of code that you might need.

If you recall, after the screen has been initialized and your window properties set, you create a window buffer to draw to. If you're writing an OpenGL application, this would be the time you use EGL to create your rendering context. I won't get into that in this series because setting up EGL is fairly straightforward and universal; any OpenGL ES tutorial can show you how it's done. There is one bit of code that should be of interest, and that is the calculation of DPI. Thankfully, the BPS system allows you to query the device for its screen size and resolution so that you can derive the DPI.

Screen events are what most developers will be interested in, so I'll start there. As mentioned previously, in order to receive events, the code will need to poll the system for them. Once a BPS event has been received, you will need to translate it into a screen event so that useful data can be extracted.
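A minimal sketch of that poll-and-translate step might look like the following. It assumes bps_initialize() and screen_request_events() were already called during setup, and handle_screen_event() is a hypothetical handler name of my own:

```c
/* Inside the main loop: poll BPS, then translate to a screen event. */
bps_event_t *event = NULL;
bps_get_event(&event, 0);   /* 0 = don't block; -1 would block forever */

if (event != NULL) {
    int domain = bps_event_get_domain(event);
    if (domain == screen_get_domain()) {
        /* The raw screen event carries the useful data. */
        screen_event_t screen_event = screen_event_get_event(event);
        handle_screen_event(screen_event);  /* hypothetical handler */
    }
}
```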

A while loop is needed because several screen events can be chained together inside one BPS event. Screen events come in many flavours, such as touch events, mouse events (if you're using a simulator or a Bluetooth mouse), and Navigator events (such as application close). To handle these, you would query for the event type using screen_get_event_property_iv().
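As a sketch, the dispatch on the event type might look like this (error checking omitted, handler bodies left as stubs; the constants are from screen/screen.h):

```c
int type;
screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_TYPE, &type);

switch (type) {
case SCREEN_EVENT_MTOUCH_TOUCH:
case SCREEN_EVENT_MTOUCH_MOVE:
case SCREEN_EVENT_MTOUCH_RELEASE:
    /* finger down / drag / up */
    break;
case SCREEN_EVENT_POINTER:
    /* mouse, from the simulator or a Bluetooth mouse */
    break;
case SCREEN_EVENT_KEYBOARD:
    /* key input */
    break;
case SCREEN_EVENT_CLOSE:
    /* window being closed -- time to shut down */
    break;
}
```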

The Navigator is the windowing system on the BB10 platform. It notifies you of orientation changes, window state changes (i.e., minimized, full screen, etc.), and so on. The one thing that bugged me is the fact that the screen handling code is responsible for determining if the application is being terminated. This sounds like it should have been the job of the Navigator instead. Perhaps there will be a change in this API in later releases, but for now, the line of responsibility between the two is slightly blurred. I will talk about handling Navigator events later on.

Handling touch events was a bit buggy when I first tried it out (prior to NDK 2.0). I had a hard time determining when a finger left the screen after a touch, so I instituted a workaround, or hack, to mask over this problem. I also filed a bug report, but have not followed up with it. It is a minor bug. In hindsight, it might have been the beta build of the NDK that was the issue.

Mouse pointer handling code may not be as useful on a touch device, but the events should still be handled, even if only by redirecting them to simulate single touch events.
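One possible sketch of that redirection, tracking the previous button state to detect press and release edges; the on_touch_down/move/up handlers are hypothetical names of my own:

```c
/* Inside the SCREEN_EVENT_POINTER case: fold the left mouse button
 * into single-touch semantics. */
int buttons;
int pos[2];
static int last_buttons = 0;   /* remembered across events */

screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_BUTTONS, &buttons);
screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_POSITION, pos);

int down      = (buttons & SCREEN_LEFT_MOUSE_BUTTON) != 0;
int was_down  = (last_buttons & SCREEN_LEFT_MOUSE_BUTTON) != 0;

if (down && !was_down) {
    on_touch_down(pos[0], pos[1]);   /* hypothetical handler */
} else if (!down && was_down) {
    on_touch_up(pos[0], pos[1]);
} else if (down) {
    on_touch_move(pos[0], pos[1]);
}
last_buttons = buttons;
```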

Keyboard input seems easy up front, but it gets complex the more you read into the various types of keyboard layouts and locale settings. The BlackBerry website has a great write-up on how to use the keyboard effectively.

You can also determine if the key is being pressed or released, or if it is repeating, by checking the key flags bitmask for KEY_DOWN and KEY_REPEAT respectively.
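A sketch of that check, assuming the event has already been identified as SCREEN_EVENT_KEYBOARD (KEY_DOWN and KEY_REPEAT come from sys/keycodes.h):

```c
int flags, sym;
screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_KEY_FLAGS, &flags);
screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_KEY_SYM, &sym);

if (flags & KEY_DOWN) {
    if (flags & KEY_REPEAT) {
        /* key held down; this is an auto-repeat */
    } else {
        /* initial press */
    }
} else {
    /* key released */
}
```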

This pretty much covers screen event handling, which is really the bulk of the event handling code. I'll touch on handling other events, such as the Navigator's events and gestures, in a later post.
