Wilbur doesn't impose any particular limit on image size except in one dialog (I can't remember which one, but it's not a dialog that deals with image size). Its technique for signaling that it's out of memory is a crash.

The program uses a minimum of about 9 bytes for each sample (more or less; I'm getting old and my CRS is flaring up more and more), meaning that the 28k by 14k image will take a minimum of about 3.5GB of memory to open (28,000 × 14,000 samples × 9 bytes), call it 4GB with overhead. That should be well within a modern computer's grasp. Wilbur composes its own images for the graphics card from the main image, so it's unlikely to overflow the graphics card, either. Yes, it's pretty slow in a lot of ways by today's standards, but it's 25 years old now...
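As a back-of-the-envelope check (a quick sketch, assuming the rough 9-bytes-per-sample figure above; the real per-sample cost may vary):

```python
# Rough memory estimate for opening a surface in Wilbur,
# assuming ~9 bytes per sample as described above (approximate figure).
width, height = 28_000, 14_000
bytes_per_sample = 9  # rough estimate, not a guaranteed constant

total_bytes = width * height * bytes_per_sample
print(f"{total_bytes / 2**30:.1f} GiB")  # about 3.3 GiB
```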

If you create a blank image that's 28k by 14k, does Wilbur handle that? Creating a blank image of the same size will tell you whether the program is simply running out of memory.

When you say that it fails to open the image, is it giving you a blank surface the size of your data?

A particular image could fail to open for any number of reasons. If the program isn't crashing, then there was probably something in the image that Wilbur didn't like. What kind of image were you trying to open?

Thirty seconds to attempt to load and process an image that large isn't bad at all. It could be that Wilbur is misinterpreting the byte order, or that there is one really high sample in a corner that causes the rest of the image to be flattened into invisibility. Look at the image histogram (Window>Histogram) and see if the range is anything other than 0. If so, there's something in the data that Wilbur doesn't like, and using height clip on the data might get rid of the spurious values.
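If you want to sanity-check the data outside Wilbur, something like the sketch below will show both failure modes at once. It assumes a bare, headerless 16-bit raw file; the filename, dimensions, and sample type are placeholders you'd adjust to match your data, and the percentile clip is just the same idea as Wilbur's height clip, not Wilbur's own code:

```python
# Hypothetical sketch: inspect a raw 16-bit heightfield for byte-order
# problems and spurious outlier samples. Assumes a headerless file of
# bare 16-bit samples; adjust WIDTH, HEIGHT, dtype, and filename to taste.
import numpy as np

WIDTH, HEIGHT = 28_000, 14_000

raw = np.fromfile("terrain.raw", dtype="<u2", count=WIDTH * HEIGHT)  # little-endian
swapped = raw.byteswap()  # the same bytes read as big-endian

for name, data in (("little-endian", raw), ("big-endian", swapped)):
    lo, hi = int(data.min()), int(data.max())
    # A plausible reading usually shows a smooth spread; a byte-order
    # mistake tends to pin min/max near the extremes of the sample type.
    print(f"{name}: min={lo} max={hi} range={hi - lo}")

# If one stray sample is stretching the range (the "invisible image" case),
# clipping to the 0.1..99.9 percentiles restores usable contrast.
lo_p, hi_p = np.percentile(raw, [0.1, 99.9])
clipped = np.clip(raw, lo_p, hi_p)
print(f"clipped range: {clipped.min()}..{clipped.max()}")
```

Whichever byte order gives the sensible-looking range is the one to tell your loader about, and if the clipped range is dramatically narrower than the raw one, a handful of bad samples is the likely culprit.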