52 points by xy2_ | 2 comments
1. jcul No.42071808
Great write-up!

I looked at ImHex a good while back and I think I had some runtime issues, or maybe even compilation issues, and didn't dig deeper, even though the definition language piqued my curiosity.

These days I tend to just use xxd, bless, ghex, or occasionally wxHexEditor, depending on what I need. But ImHex looks really powerful, like it could replace all the GUI ones. I'm looking forward to giving it another go tomorrow.

Though these days I spend most of my time in Wireshark, which is kind of a hex viewer in a way.

How does it manage with huge files? Does it try to load the entire thing into memory? I remember wxHexEditor being good for that, and even being able to open block devices directly and process memory, IIRC. Might be getting mixed up with HxD.

The decompression and combining compressed with decompressed sections looks very cool. Is the decompression in memory or written to disk?

    TagRecord Tags[while(!std::mem::eof())];

This loop-based length stuff is very cool too, though for large files I'd imagine it could be slow, as it has to iterate through all records to determine the offsets of records near the end of the file.

To be fair, Wireshark / pcap files have this problem too.
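For anyone curious, here's roughly what a complete pattern built around that construct might look like. The TagRecord layout here is just a guess on my part for illustration; only the while-sized array declaration is from the article:

    #include <std/mem.pat>

    // Hypothetical record layout: the fields are assumptions,
    // only the while-sized array below comes from the article.
    struct TagRecord {
        u32 tag;         // record type
        u32 length;      // payload size in bytes
        u8 data[length]; // payload, sized by the previous field
    };

    // Parse records back to back until end of file. Each record's
    // offset depends on the lengths of all records before it,
    // hence the sequential scan.
    TagRecord Tags[while(!std::mem::eof())] @ 0x00;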

replies(1): >>42072609
2. viraptor No.42072609
> though for large files I'd imagine it could be slow as it will need to iterate through all records to determine the offset for records at the end of the file.

Yeah, it's not doing lazy evaluation, so you need to watch out. It's probably not the solution you want for, say, looking at 500GB disk images.