Monday 6 December 2010

Compressed NIfTI works!

This was easier than I thought. :-) Now I just have to make it possible to download NIfTI images from Safari or Mail. I think that will be enough for version 1.1.

Working on supporting nii.gz

I want to support compressed NIfTI in the next release, and that is what I'm currently working on. It will make it more convenient to send images directly to the iPad through email, or to access a web page with compressed NIfTI images and download them to the app.
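The reading side is not much code. Here is a rough sketch in Python (not the app's actual Objective-C; `read_nifti_header` is a name I made up) of how a .nii.gz file can be sniffed and decompressed before the header is parsed:

```python
import gzip
import struct

def read_nifti_header(path):
    """Read the 348-byte NIfTI-1 header from a .nii or .nii.gz file.

    gzip streams always start with the magic bytes 0x1f 0x8b, so we can
    sniff the compression instead of trusting the file extension.
    """
    with open(path, "rb") as f:
        magic = f.read(2)
    opener = gzip.open if magic == b"\x1f\x8b" else open
    with opener(path, "rb") as f:
        header = f.read(348)
    # The first field of a NIfTI-1 header is sizeof_hdr, which must be 348.
    sizeof_hdr, = struct.unpack_from("<i", header, 0)
    if sizeof_hdr != 348:
        raise ValueError("not a little-endian NIfTI-1 file")
    return header
```

The rest of the file (the voxel data) can be streamed through the same gzip reader.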

Personally, I'm thinking about setting up a local web site where I can access the images I produce in my studies. Then I can browse this archive and download the images I need to the NeuroPub Visualizer, without going through a computer.

Thursday 2 December 2010

Join the NeuroPub forum at NITRC

I have started a forum at NITRC. This is a good place to have an open discussion about the app. You can find the forum here: NeuroPub forum

Float format

The current version (v1.0) will only read NIfTI images in float format. I use FSL to convert my images to this format. This is what I do to convert:

fslmaths in.nii out.nii -odt float

The output image (out.nii) will now be in 32-bit float format.

Version 1.0 now available on App Store

The NeuroPub Visualizer is finally released. You can get it here: NeuroPub Visualizer

Monday 29 November 2010

Associating NIfTI as file type in iOS

I'm currently working on a study and I have been using the NeuroPub Visualizer to review the results. It works great and it's easy to shift between different images. Uploading images through iTunes is easy too, but it requires me to hook the iPad up to my computer with a USB cable.

I would prefer to download my images wirelessly from my local web server. The next step is therefore to associate the app with NIfTI as file type. This would open up a number of possibilities:
  1. If you get a NIfTI-image as attachment in an email, you can load it into the app directly from the Mail app. This way, you can either send NIfTI-images as email from your computer to load images into the visualizer, or you can receive images from colleagues by email to your iPad. You don't have to go through a computer.
  2. It will also be possible to download NIfTI-images to the app through Safari. Your research group could have a local repository of NIfTI-images within your intranet as a web page that you can access from your iPad.
  3. You will be able to send NIfTI-images by email from the app, given that they are not too big. I only need to figure out how to support compressed NIfTI (.gz), because uncompressed NIfTI would be too big.
In the end, this makes data sharing easier than ever before. Sharing your statistical results with the scientific community is the next step from here, where you can upload your results and download others' results. More about that later.
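For the curious: iOS apps declare file-type associations in their Info.plist, via CFBundleDocumentTypes plus an exported UTI for the file extension. A sketch of the relevant keys (the `com.neuropub.nifti` identifier and the exact declaration are my own illustration, not the app's actual configuration):

```xml
<key>CFBundleDocumentTypes</key>
<array>
  <dict>
    <key>CFBundleTypeName</key>
    <string>NIfTI image</string>
    <key>LSItemContentTypes</key>
    <array>
      <string>com.neuropub.nifti</string>
    </array>
  </dict>
</array>
<key>UTExportedTypeDeclarations</key>
<array>
  <dict>
    <key>UTTypeIdentifier</key>
    <string>com.neuropub.nifti</string>
    <key>UTTypeConformsTo</key>
    <array>
      <string>public.data</string>
    </array>
    <key>UTTypeTagSpecification</key>
    <dict>
      <key>public.filename-extension</key>
      <array>
        <string>nii</string>
      </array>
    </dict>
  </dict>
</array>
```

With this in place, the app shows up in the "Open In..." list whenever Mail or Safari encounters a file with that extension.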

Thursday 25 November 2010

About next versions

I thought I should mention something about the next versions that I'm currently working on.

  1. The version that has been submitted to App Store (version 1.0) is only for iPad, but version 1.1 will be universal and support both iPhone and iPad. It will make use of the Retina display if you have an iPhone 4, and it will also work on the iPhone 3GS. I will probably not support the iPhone 3G.
  2. I will support more resolutions, so you can look at any image with a resolution of 1x1x1 mm or lower. The current version only supports 2x2x2 mm. This will be added in version 1.2.
  3. You will also be able to upload and use your own study specific templates in version 1.2.
  4. The current version only allows you to look at one image at a time, but it will be possible to load two or more images as overlays on the template in a future version. This means you will be able to compare different activation images in the same view. This feature will probably be added in version 1.3.
  5. I'm planning to create a version for Mac OS X and release it on Mac App Store, but my focus is to add the above features first.
Do you have any ideas? Feel free to comment and discuss.

Saturday 20 November 2010

Version 1.0 is submitted to App Store!

Exciting news! I have finally submitted the app to App Store. I'm calling it NeuroPub Visualizer. The review process usually takes up to two weeks, but I will let you know when it's available.

This is what the icon looks like:

Friday 19 November 2010

Version 1.0 is finally finished!!!

I'm happy to announce that version 1.0 is finally finished. It has been a long ride and I had hoped to release it sooner, but good things sometimes take time. I will submit it to App Store during the weekend. Then I will start to work on the help text that you will find on this blog.





Help

About NeuroPub

NeuroPub is a visualizer for statistical brain images (fMRI, VBM, etc) and other kinds of images (atlases etc) that can be visualised as an overlay on top of the standard MNI brain. It keeps a list of all the images you have imported to the app, so you always have immediate access to your research. This makes it a great tool to bring to conferences and meetings. You can have your own library of statistical images that you always carry with you, so you never miss a chance to demonstrate your latest results when meeting others in the neuroimaging field.

This help text describes version 1.2.

Starting the app for the first time

Both the iPhone and the iPad version have exactly the same features, but there are some minor differences, which will be described below.

iPhone

When you start the app for the first time on your iPhone, you will see an image list containing one file (example.nii.gz). If you tap on that image, you will get into the visualizer, which will visualise the example image as an overlay on the standard brain.

There are two buttons at the bottom of the image list: Reload and Help. The Help button brings you to this page. The Reload button will be further discussed under the section Controls and buttons.

iPad

When you start the app on your iPad, you will immediately get into the visualizer. If you start the app in landscape mode, you will also see the image list on the left side. Just like on the iPhone, this list will contain one file (example.nii.gz) and the image will get loaded if you tap on it. The image list will not be visible if you start the app in portrait mode, but you can then invoke the list by tapping on the Image List button that you find in the upper left corner.

The iPad version has only one button at the bottom of the image list (Reload). The Help button is located at the upper right corner.

Both iPhone & iPad


The example image (example.nii.gz) is included with the app and listed in the image list when you start the app for the first time. You cannot delete this image, but it will disappear from the image list as soon as you upload your own images.


Uploading images

Requirements

NeuroPub supports images in both .nii and .nii.gz (compressed) format. Images you upload must have the same resolution as the 2x2x2 mm^3 template that comes with SPM and FSL, and they need to be in float format. Only images satisfying these demands will be listed in the image list. The coordinate transformation matrix must also be the same as the template's. If you have FSL, you can use the fslhd command to check if your file fulfils the requirements. You should get these values:

data_type      FLOAT32
dim0           3
dim1           91
dim2           109
dim3           91
dim4           1
The data type must be FLOAT32 and the images must have the size 91x109x91. You cannot upload 4D files. Only 3D files are accepted.

pixdim1        2.0000000000
pixdim2        2.0000000000
pixdim3        2.0000000000

The voxel size must be 2x2x2 mm^3.

sto_xyz:1      -2.000000  0.000000  0.000000  90.000000
sto_xyz:2      0.000000  2.000000  0.000000  -126.000000
sto_xyz:3      0.000000  0.000000  2.000000  -72.000000
sto_xyz:4      0.000000  0.000000  0.000000  1.000000
sform_xorient  Right-to-Left

Finally, the sform matrix must be equal to the values above. Notice that the diagonal is -2, 2, 2. If your diagonal for some reason reads 2, 2, 2, your x-orientation is Left-to-Right. All images must be in the Right-to-Left format shown above.

Please note that if your image does not conform to these requirements, you won't see it in the image list and it will be deleted from the app to save space!
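All of these checks can be expressed compactly against the raw NIfTI-1 header. A sketch in Python (illustrative only, not the app's actual code; the byte offsets come from the NIfTI-1 format):

```python
import struct

# NIfTI-1 header field offsets: dim[8] (int16) at byte 40,
# datatype (int16) at byte 70, pixdim[8] (float32) at byte 76.
FLOAT32 = 16  # NIFTI_TYPE_FLOAT32

def meets_requirements(header):
    """Check a 348-byte little-endian NIfTI-1 header against the
    requirements above: FLOAT32, 91x109x91, 2 mm voxels, 3D only."""
    dim = struct.unpack_from("<8h", header, 40)
    datatype, = struct.unpack_from("<h", header, 70)
    pixdim = struct.unpack_from("<8f", header, 76)
    return (datatype == FLOAT32
            and dim[0] == 3                    # 3D, so no 4D files
            and dim[1:4] == (91, 109, 91)
            and all(abs(p - 2.0) < 1e-5 for p in pixdim[1:4]))
```

The sform rows could be checked the same way; they are stored as three rows of four float32 values starting at byte offset 280 (srow_x).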


There are two ways to get images to NeuroPub:

Import images from other apps

This is the preferred method to get images into NeuroPub. You can import images from apps such as Safari, Mail, Dropbox, etc. For instance, you can send an email to your iPhone/iPad with your NIfTI files (both .nii and .nii.gz). You can then tap on the attachment and get a list of apps that can read NIfTI. NeuroPub will be one of them. Just select NeuroPub and the image will be loaded into the image list.

It doesn't matter which app you use to upload images to NeuroPub. You can use Mail, Safari, Dropbox or any other app that can export files.

Upload images through iTunes File Sharing

An alternative way to upload images is to use iTunes File Sharing. This is the same procedure as for uploading documents to Apple's Pages app. Please look at Apple's support page to see how this is done: http://support.apple.com/kb/HT4088

The viewer

If you tap on the example.nii.gz (or any image you have uploaded), it will be loaded into the viewer and visualised as an overlay on the MNI template. You will get into a 2x2 view mode, where you can see the brain in an axial, coronal, sagittal, and 3D view at the same time.

You can now drag the red cursor in the different views to change coordinate and slice locations. The slice locations will change as you move the cursor. Thus, to change slice location in the axial view, you have to move the cursor up or down in the coronal or sagittal view. You can also rotate the 3D view by dragging your finger over the brain. Multi-touch is not included in this version of the app, so you won't be able to pinch in any of the views.

Change view by double tapping

You can go from 2x2 mode to any of the sub-views by double tapping on the view you want to see in more detail. This view will then take over the whole screen. For instance, try double tapping on the axial view. You should now see the axial view over the whole screen. In single view mode, you can change slice location by moving your finger up and down along the far right of the screen. You can go back to 2x2 mode by double tapping the screen again. This way, you can move quickly between the different views.

The 3D brain

The viewer performs volume rendering in real time. This is done by downsampling the brain to 64x64x64. The viewer displays these voxels as a stack of 2D slices in each direction, which is why you might see some artefacts at an angle of 45°, where you are in between two directions (e.g. when you are in between axial and sagittal).
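The axis-selection step behind those artefacts can be sketched like this (my own illustration of the general slice-stack technique, not the app's code):

```python
def pick_slice_axis(view_dir):
    """Choose which of the three slice stacks (0=x, 1=y, 2=z) to draw:
    the one whose axis is most aligned with the viewing direction.

    Near 45 degrees two axes are almost equally aligned, and switching
    between their stacks is what produces the visible artefacts."""
    return max(range(3), key=lambda i: abs(view_dir[i]))
```

Looking straight down the z-axis, `pick_slice_axis((0.0, 0.0, 1.0))` selects the axial stack.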

Left is left and right is right - neurological convention

I did not add labels showing what is left and right in the viewer, but you can see it by looking at where the cursor is in the 3D view. I have implemented the neurological convention, which means left is left and right is right. Labels will come in the next version.

Controls and buttons


Min/Max Apply

This is where you can enter your own threshold levels of your image. Tap on Apply (or hit return on the keyboard) to apply the new threshold settings. Voxels below the min level will be removed. Voxels above the max level will by default still be visible, but will get the same colour as the voxels at the max level. You can make the voxels above the max level disappear if you like by turning on the Upper threshold tool (read more about this below).
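The thresholding rule can be sketched as follows (my reading of the behaviour described above, not the app's actual code):

```python
def colour_position(value, vmin, vmax, upper_threshold=False):
    """Map a voxel value to a position in [0, 1] along the colour map.

    Returns None for hidden voxels: below vmin always, and above vmax
    only when the Upper threshold tool is on. Values above vmax are
    otherwise clamped, so they take the colour of the max level."""
    if value < vmin:
        return None
    if value > vmax:
        if upper_threshold:
            return None
        value = vmax
    return (value - vmin) / (vmax - vmin)
```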

Reset

The tool will automatically set default min and max threshold values the first time you load an image. This is not done when you switch between images: if you have set a min value of 2.3 in one image and you switch to another image, the same min value will be applied to that image too.

However, you can reset the settings to the min and max values of the image by tapping the Reset button, just like when you load an image the first time.

X Y Z

The MNI coordinate and the corresponding voxel value of your selected image are printed next to the Reset button on the iPad and next to the Tools button on the iPhone.


Edit 

The edit button is found in the left corner of the image list. It allows you to delete images from your image list. This button will be disabled if you haven't uploaded any of your own images. The example.nii.gz image cannot be deleted, but will be hidden once you have added your own images.

Reload

If you have uploaded your own NIfTI images and you don't see them in the list, tap on the reload button to make them appear. If they still don't appear, the reason is probably that they don't fulfil the requirements.

View

This button gives you a menu of the different views (2x2, Axial, Coronal, Sagittal, 3D). This is an alternative way of moving between the views. However, as explained in The Viewer section, the quickest way is to move between the views by double tapping.

Colour

This button gives you a menu with different colour maps (Hot, Jet, Autumn, Cool). You can thus change the colour of the overlay by choosing any of these colour maps.

Tools

This button gives you a menu with a number of different tools, which will be described in detail below:

Cursor on/off

You can turn the cursor on or off with this tool. You can still move the cursor around when it is turned off; it is simply invisible. This is useful if you want to take a screenshot and don't want the cursor overlapping the image.

3D Transparency on/off

This tool allows you to make the 3D brain transparent so you can see regions (e.g. statistical activations) inside the brain. If you have this feature turned off, you will only see the overlay image rendered on the surface of the brain.

This is what it looks like with transparency turned off.

With transparency turned on, you can see the overlay image inside the brain.


3D Overlay alpha on/off

Normally, all the voxels above the minimum threshold level will be 100% opaque and all the voxels below the min level will not be seen at all. If you turn on this tool, the transparency level will be changed so that voxels close to the min level will be more transparent and voxels close to the max level will be more opaque. The image will look softer with the alpha turned on, which might look better for some images. It is best to use this feature with 3D Transparency turned on.
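The ramp described above could look like this (again my reading of the feature, not the actual implementation):

```python
def overlay_alpha(value, vmin, vmax, alpha_on=True):
    """Opacity for a voxel, given the min/max threshold levels.

    With the alpha tool off, every visible voxel is fully opaque; with
    it on, opacity ramps linearly from transparent at vmin to opaque
    at vmax (and stays opaque above vmax)."""
    if value < vmin:
        return 0.0           # below threshold: not drawn at all
    if not alpha_on:
        return 1.0
    return min((value - vmin) / (vmax - vmin), 1.0)
```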

This image shows what the brain looks like with the Overlay alpha feature turned off.

This image shows what the brain looks like with the Overlay alpha feature turned on.




Overlay mask on/off

If you haven't masked your images, you might have voxels outside the standard brain that are above your minimum threshold value. These voxels will not be shown in the 3D view, but they will be visible in the 2D views. You can mask the overlay image so that only voxels inside the brain are visible in the 2D views by using this tool.

Upper threshold on/off

This feature allows you to make voxels above the max level disappear. You can thus remove all positive voxels in a statistical image if you only want to look at negative voxels. If you have loaded an atlas, you can remove all voxels below and above a certain value to make sure you only see the region associated with that value. For instance, if you have voxels with higher values than 2.0 and you enter a max value of 2.0, you will not see these voxels.

Seed voxel in Neurosynth (available in v1.2.2, which is submitted to App Store)

Neurosynth is a database for peak coordinates and corresponding meta-data. With this tool, you can automatically look up co-activation maps in Neurosynth given the MNI coordinate of the cursor. NeuroPub will open Safari with the corresponding Neurosynth page, from which you will be able to download a NIfTI file and open it in NeuroPub. Neurosynth stores seed-voxel maps on a 4 mm grid, so NeuroPub will find the nearest map given the current coordinate. The maps themselves are in 2x2x2 mm^3.
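Finding the nearest seed map amounts to snapping the cursor coordinate to the 4 mm grid. A minimal sketch (I'm assuming the grid is anchored at the origin, which may not match Neurosynth exactly):

```python
def nearest_seed_coordinate(x, y, z, grid=4):
    """Snap an MNI coordinate (mm) to the nearest point on the 4 mm
    grid that the seed-voxel maps are stored on."""
    return tuple(grid * round(c / grid) for c in (x, y, z))
```

For example, a cursor at (3, 5, -1) would map to the seed at (4, 4, 0).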

Email image

This tool allows you to send the selected image by email. Both compressed and uncompressed files are supported. For instance, if you are at a conference presenting your latest brain study, you might want to share your statistical results with people that you meet. You can then easily send your NIfTI file to their email address and they can immediately view it on their iPhone if they have installed NeuroPub.

Feel free to comment on this help text if you have any questions.

Progress update

I have come quite far with the viewer over the last few days. The screenshot below shows the latest version:



The viewer now displays a jet map instead of a blue colour, which I think is nice. You can control the upper and lower thresholds by typing them into the text fields above the viewer. The tool now also prints the MNI coordinates and the value of the voxel.

There are still a few things to do, but I should be able to submit the app to App Store really soon!!!

Thursday 18 November 2010

Template and other things

I have decided which standard brain will be included with the tool. It will be the ICBM 2009a non-linear asymmetric template (Copyright (C) 1993-2004 Louis Collins, McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University), subsampled to 2x2x2 resolution. This standard brain is included in FSL (avg152T1_brain.nii.gz).

Some limitations for the first version:
  1. The dimensions of the standard template are 91x109x91, with 2x2x2 mm^3 voxels. The coordinate system is Right-to-Left. Images you add have to be in the same space with the same dimensions as the template. This is the standard resolution that comes with SPM and FSL, so there shouldn't be any problems.
  2. All images you add have to be in float format. You need to convert them to float if you have byte format or 16-bit format.
  3. The app will check for this in the image header and only list images that fulfil this requirement. Only uncompressed .nii files will be supported.
One change is that the tool will support thresholding of images. Instead of visualising all voxels above 0 in the same colour, the app will show the images in a jet-map and you will set the threshold yourself. I'm currently adding this feature and I think it will be quite useful.

Friday 12 November 2010

iTunes upload works

It's now possible to upload images to the app through iTunes. A release is getting closer.

Friday 5 November 2010

Version 1.0 getting closer - but will not support compressed NIfTI.

I don't have any problems reading uncompressed NIfTI, but I can't get it to work when the image has been compressed. Support for compressed NIfTI will have to wait for this reason. It will come in version 1.1 instead.

The first version will be quite simple anyway. You will be able to upload NIfTI images that have the same format as the 2x2x2 standard brain but only voxels above zero will be visualised and they will all have the same colour. It's better to get this version out soon with limited features.

These are the things that I have left to do:
  1. Change standard brain. I have been using a custom made standard brain. It will be changed to the MNI brain.
  2. Limit the user from doing "stupid" things, like moving the cursor outside the brain.
  3. Making it possible to upload NIfTI files through iTunes and list them in the files table. This is the part that still requires some development, but it should be quite easy to implement.
I'll keep you posted on the progress.

Friday 29 October 2010

neuropub.com is up again

You might have noticed that neuropub.com has been unreachable for some time. The reason is that I didn't pay the fee for the domain name in time, so I lost it. Then it took more than a month to get it back again. I won't make the same mistake twice.

The app itself is delayed - again. I'm working on it in my spare time, which is why progress is slow. Now I'm hoping for a release in December. I'm also hoping to get more time for this project in the near future.

Tuesday 17 August 2010

3D glasses for NeuroPub will only work with iPad and iPhone 4

It turns out that it's not possible for an app to detect and use the external video connector on an iPhone 3GS, which is really a pity. Apple only allows apps running on iPad and iPhone 4 to use the external video.

I have spent quite some time trying to get this to work on my iPhone 3GS. Now I really want an iPhone 4. Oh well, I can still try the iPhone app on my iPad for this specific purpose. I should add that the 3D glasses work just fine on my iPhone 3GS when using them to watch movies from the iPod app. It's just that they cannot be used from the NeuroPub app. For that, you need an iPad or an iPhone 4.

Thursday 5 August 2010

3D stereo graphics from your iPad - it works!

After writing the previous post I couldn't resist, so I made an iPhone app to see if I could get true stereo graphics from the iPhone to my Vuzix 3D glasses. I still have some problems getting it to work on my iPhone; it refuses to recognise the glasses as a second screen. I should download the latest version of the iPhone OS (iOS) to see if the problem persists.

Luckily, it works beautifully on my iPad, even though it's an iPhone app. It's really cool. By making the brain a little bit transparent, it's possible to see halfway through the brain. The stereo aspect does make it easier to realise how deep inside the brain the activations (or whatever you are observing) are.

This technique is not limited to the 3D glasses. If you own an LCD-TV with support for 3D stereo, it will hopefully support the side-by-side stereo format that the app is using. Then you can connect your iPad to your TV and watch the 3D brain coming out of your screen.

The image below shows what it looks like. The side-by-side technique means the left part of the screen goes to your left eye and the right part goes to your right eye. You can actually see the effect yourself in the picture below. Try relaxing your eyes and looking "behind" the screen to overlap the two brains, and you should get a third brain image in the middle, in stereo.

Wednesday 4 August 2010

3D stereo graphics from your iPhone

This year is the year of 3D stereo TVs. In 2009 we went to the movie theatres to watch movies like Avatar in true 3D, and now we can buy our own 3D LCD TV to watch 3D movies at home. It's an interesting development for brain image visualisation. I played with this a couple of years ago by using active shutter glasses for CRT-screens.

One thing that has drawn my attention is the kind of 3D glasses with a built-in LCD screen that you can connect to your iPhone or iPad. They are actually meant for watching movies from your iPhone, but they also support 3D stereo graphics.


It's actually possible to send 3D stereo graphics from any iPhone app to these kinds of glasses, given that the developer has implemented such a feature. I actually bought a pair just to play with the idea. They are quite cheap. The pair you see in the image is the Cinemizer Plus from Carl Zeiss. They only cost about €300.

I actually bought the Vuzix iWear instead, which is about the same price. I believe that the Cinemizer Plus might be better so I'm thinking about buying these too in order to compare. I have only watched 3D YouTube videos from my iPhone so far, but some of them are quite impressive.

What do you think? Would you get a pair of these if the NeuroPub app supports them for 3D volume rendering? Think of being able to watch your statistical results in true 3D stereo from your iPhone. Please feel free to comment or vote on the right side of the blog.

Monday 2 August 2010

Aiming for release in late August

The plan was to release the app within a couple of weeks after the HBM conference, but I didn't get the time. Now I'm aiming for a release this month instead. The first version will be quite simple, but I have many interesting ideas for upcoming versions. More about that later.

Friday 18 June 2010

Vote!

Neuroimages are quite large, and the app will therefore be quite memory-demanding. Also, the older versions of iPhone/iPod touch do not support OpenGL ES 2, which is something I'm planning to use for 3D textures in the volume rendering view. For this reason, I was thinking I should only support iDevices with 256 MB of memory or more. This means I will not support iPhone 3G or 2G.

However, I do listen to you. If many of you really need iPhone 3G support, I will try my best to support it. But there's no point in building in support for devices having less than 256 MB of memory if you are all sitting on the latest hardware anyway.

I have created a poll where you can vote on which devices you would like to see supported. If I see a big demand for iPhone 3G/2G support, I will probably support it. However, it means I have to put more effort into supporting old hardware and get less time for advanced features.

You can select multiple choices, such as both iPhone 3Gs/4 and iPad. I've given my vote. :)

Monday 14 June 2010

HBM poster

Now you can download the NeuroPub poster from HBM 2010. You will find the link under About NeuroPub.

Thanks everyone for all the feedback I received during my poster session!

Sunday 6 June 2010

At #OHBM2010 - the iPad visualiser finally works

I have finally arrived at the conference. There is still some work to do before I can release the iPad version of the app, but I made some progress yesterday. Now I can move the crosshair in the different views and the slice positions will change accordingly. The crosshair also moves in the 3D view. The viewer is still a bit slow when you change slice location in the 2D views, while the 3D view is very smooth and fast. This is probably easy to optimise.

If you want to see the app in action and happen to be at #OHBM2010, please visit the poster #948 during the poster session on Wednesday & Thursday afternoon.

Wednesday 2 June 2010

eBooks and eMags with interactive contents - the future for scientific journals?

The iPad is a great PDF viewer and I just love reading scientific papers with the Papers app. But it's all static pages, so I can't do much more than if I printed them. Then I downloaded the Wired magazine app, and it's really nice. Take a look at the two YouTube movies below.






The thing with the iPad as a magazine reader is the interactivity between user and contents. This makes me wonder what the future of scientific journals could be. Instead of having a few static brain images, why not create an eJournal that lets you browse through the brain and look at the statistical images from any view? Think of NeuroImage Digital Edition - how cool would it be to slice through the brain with your finger when you look at a specific figure?

We are almost there. Journals today allow you to submit extra digital material with your papers, such as movie clips. But it can be done better. If Wired can do an interactive magazine for iPad, so can NeuroImage and any other scientific journal.

I would really like to see something similar to NeuroPub in an eJournal. The idea would be that you just double tap on a figure that you want to interact with, for instance a 3D volume rendering of the brain. Then you immediately get into interactive mode and can rotate the brain in any direction in real time to see what the activation pattern looks like from another view.

Why stop there? Since the iPad has Internet capability, you could also let other readers comment on your paper, directly in the eJournal itself!

This is also a little bit of how I see the future of NeuroPub. The first version will just be a simple viewer, but future versions could allow you to chat with other users through Twitter and even share your statistical images with others, perhaps with an interactive method description that points out the locations in the brain the author finds most interesting.

Tuesday 1 June 2010

Creating the poster

I finally have the image I need for the poster that I will present in Barcelona. It looks quite nice.


It took some time to get the crosshair to work properly, but now it's possible to select a coordinate in MNI space and see the location in both 2D and 3D.

Viewer is taking form

This is what the viewer will look like. The image list is on the left side, while the viewer is on the right. A typical iPad split view, in other words. The viewer is divided into four subviews, which will contain a coronal, sagittal, axial, and a 3D view.

Only the 3D view is implemented so far, but it is, on the other hand, the most complicated view, so I'm happy that I've finished that part. The iPad is really fast and there's no problem doing volume rendering in real time on the device. You just slide your finger on top of the view to rotate the brain.

Monday 31 May 2010

The name NeuroPub

What is the idea behind the name NeuroPub? It's a play on words. Just like PubMed refers to a database with medical publications, NeuroPub refers to a neuroimaging viewer for your published results. Since the iPad is such a great ebook reader, NeuroPub on iPad is like having your own interactive publications in the palm of your hand.

I have the 64 GB iPad. A 32-bit float image is about 2 MB in size (2x2x2 mm resolution). This means 1 GB can hold roughly 500 different statistical parametric maps and I could fill the iPad with 30000 images in total! If I go up in resolution to 1x1x1 mm, this number obviously needs to be divided by 8, which still leaves us with an impressive number of 3750 images.

I doubt anyone will actually have that many statistical images, so the NeuroPub app will easily store all your current and future results, which you can always carry with you on your iPad or iPhone at conferences, meetings, etc.
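For the curious, the back-of-the-envelope arithmetic in Python (note that the raw, uncompressed size of a 91x109x91 float image works out closer to 3.4 MiB; I take the ~2 MB figure above to be a round approximation, so treat the image counts as rough):

```python
# Back-of-the-envelope storage arithmetic for 2x2x2 mm float maps.
voxels = 91 * 109 * 91                  # 902,629 voxels
raw_bytes = voxels * 4 + 352            # 32-bit floats plus the NIfTI-1 header
raw_mib = raw_bytes / 2**20             # ~3.4 MiB uncompressed

# Going from 2x2x2 mm to 1x1x1 mm doubles each dimension, so the
# per-image size grows (and images-per-GB shrinks) by a factor of 8.
resolution_factor = 2 * 2 * 2
```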

First version only for iPad

The first version of NeuroPub will only be available for iPad. This is because the iPhone version depends on iPhoneOS 4. However, I plan to release the iPhone version as soon as possible. It will be released as a universal app.

There are just a few days left before I go to Barcelona, and I have to get the poster ready in good time. I need some good screenshots from the viewer, but I still have some work to do. Take a look at the screenshot below:




The idea is to have a 2x2 view, where the brain is visualised in axial, coronal, sagittal, and 3D. The viewer will come with the MNI standard brain in 2x2x2 resolution, so the requirement is that all the images that you want to view are in exactly the same field of view and resolution as the standard brain.

The images will be transferred through Apple's file sharing procedures, where you connect your iPad to iTunes and add your images to the NeuroPub app in iTunes. The viewer will only support the NIfTI format, so any other image format (MINC, Analyze, etc) will have to be converted first.

More screenshots coming up soon. :)

Saturday 29 May 2010

NeuroPub - a Brain Visualisation Tool for iPad, iPhone and iPod touch

That is the name of my poster that will be presented at HBM 2010 (the 16th Annual Meeting of the Organization for Human Brain Mapping). The conference is in Barcelona this year, and there are only eight days left before it starts. My plan was to have the app released on the App Store in time for the conference, but I don't think it will get approved in time. It doesn't matter too much. It's more important that I have a great app that I can demonstrate on my iPad. I also need to prepare and print the poster, which I will present on Thursday June 10.

What is NeuroPub about? Statistical Parametric Maps for functional and structural images are often located in the same stereotaxic space - the ICBM152 space. This means results from different experiments can be visualised and compared on the same standard brain. The idea with NeuroPub is to create a tool which can be used as a library for all your statistical images - browse, search and visualise. With the advent of powerful mobile devices, such as the iPhone and the iPad, it's now possible to create this tool for iPhone OS - and store all your statistical images in your pocket! It will be a valuable tool whenever you want to discuss your results with other researchers, for instance at meetings such as the HBM conference.