I am not 100% sure what you mean, but it is definitely clear that the GEDCOM standard falls short on many fronts. Unfortunately, within FH we are forced to work within those constraints, so any method of recording (and referencing and searching) this data inherits them. I used to work with Gramps a lot (and TMG before that), and Gramps is one of the programs I liked most for its ability to record names (and places, too). Gramps not only lets the researcher differentiate between things like birth names, married names, nicknames, and a.k.a.'s, but also lets you specify the origin of a name element, such as Matronym, Patronym, or Toponym, and that is on top of separating out name elements like titles, suffixes, prefixes, and a host of other things (Names in Gramps). Gramps has put a similar amount of thought into places.
Unfortunately, as mentioned, these are all things that fall outside the GEDCOM definition, so for our purposes we have to make do with what we have. Recording every single name variant encountered in the sources has worked best for me. For simpler and cleaner searching and reporting, that has meant "normalizing" certain names. Not ideal, but it is what it is. Incidentally, I like FH much, much better than Gramps overall, but I wish there were a more robust way to represent names and their various elements.
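Just to illustrate what the standard does give us: GEDCOM 5.5.1 allows multiple NAME structures per individual, each with an optional TYPE (birth, married, aka, etc.), the standard name pieces (NPFX, GIVN, NICK, SURN, NSFX), and its own source citations. So a normalized primary name plus the variants found in the sources might look roughly like this in the raw GEDCOM (a sketch only; the @I1@ and @S1@ records are made up, and the exact tags FH writes may differ):

0 @I1@ INDI
1 NAME Mary /Smith/
2 TYPE birth
2 GIVN Mary
2 SURN Smith
1 NAME Mary /Jones/
2 TYPE married
2 SURN Jones
2 SOUR @S1@
1 NAME Polly /Smith/
2 TYPE aka
2 NICK Polly

As far as I can tell, the first NAME is what gets treated as the primary name for sorting and reports, which is where the normalizing comes in; the later ones just sit there as searchable alternates, each citing the source it came from. It is nowhere near the Gramps model (no way to mark a surname as a patronym or toponym, for instance), but it is what the standard gives us to work with.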