For embedded, IDA is still very ergonomic, but since it's not architecture-abstract the way Ghidra is, its decompiler only works on select platforms.
Ghidra’s architecture lends itself to really powerful automation tricks since you can basically step through the program from your plugin without having an actual debug target, no matter the architecture. With the rise of LLMs, this is a big edge for Ghidra as it’s more flexible and easier to hook into to build tools.
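To make that concrete, here's a rough sketch of driving Ghidra's P-code emulator from a script with no debug target attached. The entry address is a placeholder and the exact EmulatorHelper overloads may vary between Ghidra versions, so treat it as a sketch rather than copy-paste:

```python
# Sketch: stepping through code with Ghidra's P-code emulator from a script,
# no debugger or real target needed. Run it from the Script Manager so
# currentProgram/monitor/toAddr are available. 0x401000 is a placeholder.
from ghidra.app.emulator import EmulatorHelper

emu = EmulatorHelper(currentProgram)
pc = emu.getPCRegister()          # architecture-agnostic: whatever PC register the language defines

entry = toAddr(0x401000)          # hypothetical function to emulate
emu.writeRegister(pc, entry.getOffset())

for _ in range(100):              # step a bounded number of instructions
    if not emu.step(monitor):
        print("emulation fault: %s" % emu.getLastError())
        break
    print("executed %s" % emu.getExecutionAddress())

emu.dispose()
```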
The overall Ghidra plugin programming story has been catching up; it's always been more modular than IDA, but in the past it was too Java-oriented to be fun for most people. The Python bindings are a lot better now. IDA scripting has been quite good for a long time, so there's a good corpus of plugins out there too.
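Assuming the newer CPython bindings (PyGhidra) are what's meant, the flow is roughly this; the binary path is a placeholder and it assumes GHIDRA_INSTALL_DIR points at a Ghidra install:

```python
# Sketch: driving Ghidra from a regular Python 3 interpreter via pyghidra
# instead of the built-in Jython. "/tmp/target.bin" is a placeholder path.
import pyghidra

pyghidra.start()  # spins up the Ghidra runtime in-process

with pyghidra.open_program("/tmp/target.bin") as flat_api:
    program = flat_api.getCurrentProgram()
    # Same flat API that Jython scripts get, e.g. walk all functions:
    for func in program.getFunctionManager().getFunctions(True):
        print(func.getName(), func.getEntryPoint())
```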
(not if you're only doing x86/ARM stuff, though)
I was recently trying to analyse a 600 MB exe (Denuvo or similar). I wasted a week after Ghidra crashed 30h+ in, multiple times. A separate project with a 300 MB exe took about 5h, so there's some horrible scaling going on. So I tried out IDA for the first time, and it finished in less than an hour. Faced with having decomp vs not, I started learning how to use it.
So, first difference, given the above: IDA is far, far better at interrupting tasks and crash recovery. Every time Ghidra crashed I was left with nothing; when IDA crashes you get a prompt to recover from autosave. Even if you don't crash, in general it feels like IDA will let you interrupt a task and still get partial results, which you might even be able to pick back up from later, while Ghidra just leaves you with nothing.
In terms of pure decomp quality, I don't really think either wins; decomp is always awkward, just awkward in different ways for each. I prefer Ghidra's, but that might just be because I've used it much longer. IDA does do better at suggesting function/variable names - if a variable is passed to a bunch of functions taking a GameManager*, it might automatically call it game_manager.
When defining types, I far prefer IDA's approach of just letting me write C/C++. Ghidra's struct editor is awkward, and I've never worked out a good way of dealing with inheritance. For defining functions/args, on the other hand, IDA gives you a raw text box but just doesn't let you change some things? There I prefer the way Ghidra does it; I especially like that it shows which register each arg is assigned to.
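On the struct side, here's a rough sketch of what defining types through Ghidra's scripting API looks like, embedding the base struct as the first member to fake inheritance. Type and field names are invented, and the exact Structure.add overload may vary by version:

```python
# Sketch: approximating C++ inheritance in Ghidra by embedding the base struct
# as the first member of the derived struct. Type/field names are invented.
# Run from the Script Manager, which wraps the script in a transaction.
from ghidra.program.model.data import (StructureDataType, IntegerDataType,
                                        PointerDataType, DataTypeConflictHandler)

base = StructureDataType("Entity", 0)
base.add(PointerDataType.dataType, "vtable", "vtable pointer")
base.add(IntegerDataType.dataType, "id", None)

derived = StructureDataType("Player", 0)
derived.add(base, "base", "embedded Entity base class")  # the 'inheritance'
derived.add(IntegerDataType.dataType, "health", None)

dtm = currentProgram.getDataTypeManager()
dtm.addDataType(base, DataTypeConflictHandler.DEFAULT_HANDLER)
dtm.addDataType(derived, DataTypeConflictHandler.DEFAULT_HANDLER)
```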
Another big difference I've noticed between the two is that Ghidra seems to operate on more of a push model, while IDA is more of a pull model - i.e. when you make a change, Ghidra tends to hang for a second propagating it to everything referencing it, while IDA pulls the latest version when you look at the reference. I have no idea if this is how they actually work internally; it's just what it feels like. IDA's pull model is a lot more responsive on a large exe, but multiple times I've had decomp not update after editing one of the functions it called.
Overall, I find IDA probably slightly better. I'm not about to pay for IDA Pro though, and I'm really uneasy about how it uploads all my executables to do decomp. Ghidra, on the other hand, is proper FOSS and gives comparable results (for small executables). So I'll probably stick with Ghidra where I can.
During the startup auto analysis? For large binaries it makes sense to dial back the number of analysis passes and only trigger them manually, one by one, if you really need them. You also get to save in between passes.
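In case it helps, the passes can also be inspected and toggled from a script (or a headless pre-script) rather than clicking through the Analysis Options dialog. Rough sketch; analyzer names differ between Ghidra versions and loaders, so the two disabled here are just examples:

```python
# Sketch: listing analyzers and switching some off from a Ghidra Python script.
# Analyzer names vary by Ghidra version/loader; the ones below are examples.
opts = getCurrentAnalysisOptionsAndValues(currentProgram)
for name in sorted(opts.keySet()):
    print("%s = %s" % (name, opts.get(name)))

# Turn off a couple of the slower passes (names assumed - check the list above).
for analyzer in ["Decompiler Parameter ID", "Decompiler Switch Analysis"]:
    if opts.containsKey(analyzer):
        setAnalysisOption(currentProgram, analyzer, "false")
```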
I figured I probably could remove some passes, but being a light user I didn't really know, and didn't want to spend the time learning, how important each one is and how long they take. IDA's defaults were just better.
Ghidra is the better tool if you're dealing with exotic architectures, even ones that you need to implement support for yourself. That's because any architecture that you have a full SLEIGH definition for will get decompilation output for free. It might not be the best decompiler out there, sure, but for some architectures it's the only decompiler available.
Both are generally shit UX-wise and take time to learn. I mostly switched from IDA to Ghidra a while back, which felt like pulling teeth. Now when I sometimes go back to IDA, that feels like pulling teeth.
- AVR
- Z80
- HC08
- 8051
- Tricore
- Xtensa
- WebAssembly
- Apple/Samsung S5L87xx NAND controller command sequencer VLIW (custom SLEIGH)
And probably more that I've forgotten. It's also not just about lack of support, but the fact that you have to pay extra for every single decompiler. This sucks if the kind of work you do has you analyzing a wide variety of targets.
IDA also struggles with disasm for Harvard architectures, which tend to make up the bulk of what I analyze - it's all faked around synthetic relocations. Ghidra has native support for multiple address spaces.
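If you want to see the difference, here's a quick Ghidra Python sketch that dumps a program's address spaces; on a Harvard part (AVR, 8051, etc.) the processor module typically defines separate code and data spaces, though the exact names depend on the module:

```python
# Sketch: dumping a program's address spaces. On a Harvard-architecture
# processor module you'd normally see distinct code and data spaces instead of
# one flat space; exact space names depend on the processor definition.
factory = currentProgram.getAddressFactory()
for space in factory.getAddressSpaces():
    print("%s: %d-bit, type=%d" % (space.getName(), space.getSize(), space.getType()))
print("default space: %s" % factory.getDefaultAddressSpace().getName())
```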
Maybe we need to get some good cracked^Wcommunity releases of Binja so that we can all test it as thoroughly as IDA. The limited free version doesn't cut it unfortunately - if I can't test it on what I actually want to use it for, it's not a good test.
(also it doesn't have collaborative analysis in anything but the 'call us' enterprise plan)