Friday 18 June 2010

Vote!

Neuroimages are quite large, so the app will be quite memory demanding. Also, older versions of the iPhone/iPod touch do not support OpenGL ES 2.0, which I'm planning to use for 3D textures in the volume rendering view. For these reasons, I was thinking I should only support iDevices with 256 MB of memory or more. This means I will not support the iPhone 3G or 2G.

However, I do listen to you. If many of you really need iPhone 3G support, I will try my best to provide it. But there's no point in supporting devices with less than 256 MB of memory if you're all on the latest hardware anyway.

I have created a poll where you can vote for the devices you would like to see supported. If I see a big demand for iPhone 3G/2G support, I will probably add it. However, that means putting more effort into supporting old hardware, leaving less time for advanced features.

You can select multiple choices, such as both iPhone 3GS/4 and iPad. I've cast my vote. :)

Monday 14 June 2010

HBM poster

Now you can download the NeuroPub poster from HBM 2010. You will find the link under About NeuroPub.

Thanks everyone for all the feedback I received during my poster session!

Sunday 6 June 2010

At #OHBM2010 - the iPad visualiser finally works

I have finally arrived at the conference. There is still some work to do before I can release the iPad version of the app, but I made some progress yesterday. I can now move the crosshair in the different views, and the slice positions change accordingly. The crosshair also moves in the 3D view. The viewer is still a bit slow when changing slice location in the 2D views, while the 3D view is very smooth and fast. This is probably easy to optimise.

If you want to see the app in action and happen to be at #OHBM2010, please visit poster #948 during the poster sessions on Wednesday and Thursday afternoon.

Wednesday 2 June 2010

eBooks and eMags with interactive contents - the future for scientific journals?

The iPad is a great PDF viewer, and I just love reading scientific papers with the Papers app. But they're just static pages, so I can't do much more with them than with a printout. Then I downloaded the Wired magazine app, and it's really nice. Take a look at the two YouTube videos below.

[Two embedded YouTube videos showing the Wired magazine app on the iPad]
What makes the iPad special as a magazine reader is the interactivity between user and content. It makes me wonder what the future of scientific journals could look like. Instead of having a few static brain images, why not create an eJournal that lets you browse through the brain and look at the statistical images from any angle? Think of a NeuroImage Digital Edition - how cool would it be to slice through the brain with your finger while looking at a specific figure?

We are almost there. Journals today allow you to submit supplementary digital material with your papers, such as movie clips. But it can be done better. If Wired can make an interactive magazine for the iPad, so can NeuroImage and any other scientific journal.

I would really like to see something similar to NeuroPub in an eJournal. The idea is that you just double-tap on a figure you want to interact with, for instance a 3D volume rendering of the brain. You then immediately enter interactive mode and can rotate the brain in any direction in real time to see what the activation pattern looks like from another angle.

Why stop there? Since the iPad has Internet capability, you could also let other readers comment on your paper, directly in the eJournal itself!

This is also roughly how I see the future of NeuroPub. The first version will just be a simple viewer, but future versions could let you chat with other users through Twitter and even share your statistical images with others, perhaps with an interactive methods description that points out the locations in the brain the author finds most interesting.

Tuesday 1 June 2010

Creating the poster

I finally have the image I need for the poster that I will present in Barcelona. It looks quite nice.


It took some time to get the crosshair working properly, but it's now possible to select a coordinate in MNI space and see the location in both 2D and 3D.

Viewer is taking form

This is what the viewer will look like. The image list is on the left, while the viewer is on the right - a typical iPad split view, in other words. The viewer is divided into four subviews, which will contain coronal, sagittal, and axial slices plus a 3D view.

Only the 3D view is implemented so far, but on the other hand it is the most complicated view, so I'm happy that part is finished. The iPad is really fast, and there's no problem doing volume rendering in real time on the device. You just slide your finger across the view to rotate the brain.