There are apps that help kids on the autism spectrum communicate, and flashcard systems, and we're experimenting with these, but they're mostly geared towards encouraging the child to communicate in the first place. In our case, he communicates fine with gestures, nudges, pointing at things he wants, bringing flash cards of foods he wants, eye contact, etc. And he seems to have good cognitive skills in terms of puzzles, basic arithmetic and counting, memory, etc. It's learning language as a spoken/auditory system that seems to be really difficult.
Adam can 'read' in the sense that he knows and recognizes all the letters (he takes delight in that) and can pronounce the few syllables he's able to produce when he sees them written out (mostly the consonants m, n, h with the vowels a, o, e). His phonemic understanding of other syllables exists but is poor (e.g. he has trouble choosing between a BAH card and a PAH card when I say one of them out loud, whereas the letters B/P in isolation are easy).

My idea is to build an app/site which teaches him and reinforces the three-way connections [picture] <--> [written form] <--> [sound] by letting him "type", initially by pecking at large on-screen squares with letters on them, rather than an entire keyboard. So, for example: there's a picture of him at the top, a row of 4 big blank squares underneath (the leftmost of which is blinking), and 7-8 letters strewn around at the bottom, from which he can tap A-D-A-M in sequence and get a victory sound effect. For words he doesn't know or remember, there's a mode where he just needs to copy e.g. C-A-T, which is already written in identical squares in a separate row just above; after a few successes the hint row goes away. As an MVP where I can quickly backfill 100-200 simple words like that and track progress, this would already, I think, be valuable; then maybe I can add a mode where the word is only spoken aloud (with or without the picture) and he needs to type it.
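To make the mechanics concrete for myself, here's a rough sketch in plain JS of how the per-word data and the tap-checking could look; all the names (WORDS, WordTrial) and file paths are placeholders I'm inventing for illustration, nothing exists yet:

```javascript
// Each entry ties together the three-way connection
// [picture] <--> [written form] <--> [sound].
const WORDS = [
  { word: "ADAM", picture: "img/adam.jpg", audio: "snd/adam.mp3" },
  { word: "CAT",  picture: "img/cat.jpg",  audio: "snd/cat.mp3"  },
  // ...the 100-200 simple words get backfilled here
];

// One attempt at typing one word. Tracks wrong taps so progress
// (e.g. "CAT typed with no hint row and no errors") can be logged.
class WordTrial {
  constructor(entry, { showHint = false } = {}) {
    this.entry = entry;       // the WORDS item being practiced
    this.position = 0;        // index of the square waiting to be filled
    this.errors = 0;          // wrong taps during this trial
    this.showHint = showHint; // whether the C-A-T hint row is visible
  }

  // Called when a letter square at the bottom is tapped. Returns
  // "wrong", "correct", or "done" so the UI layer can flash the square,
  // advance the blinking cursor, or play the victory sound.
  tap(letter) {
    if (letter !== this.entry.word[this.position]) {
      this.errors++;
      return "wrong";
    }
    this.position++;
    return this.position === this.entry.word.length ? "done" : "correct";
  }
}

// Quick check: simulate A-D-A-M being tapped in order.
const trial = new WordTrial(WORDS[0]);
for (const letter of ["A", "D", "A", "M"]) {
  console.log(letter, "->", trial.tap(letter)); // ends with "M -> done"
}
```

The idea behind keeping this separate from the rendering is that the same little state machine should work for all the modes (picture only, hint row, sound only), with only what's shown on screen changing.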
If all this works for simple words, and he takes pleasure in typing, the stretch goal is to move from single words to short sentences: both teach him phrases like I WANT [X] or WHERE IS MOM?, and let him request things with such phrases. None of this directly addresses the apraxia problem of actually learning to move his lips/tongue/throat/etc. appropriately, but I hope it can create more scaffolding around our efforts in that area (which we work very hard on daily) and together help him build an understanding of language and syntax. I'm very worried that, despite ongoing (very slow) progress in both speaking and understanding, phrases, sentences and syntax seem to elude Adam's grasp, and time is running so very fast.
I've been a backend/systems developer almost all my life, with not a lot of frontend experience (although I do know basic HTML/CSS/JS) and no app development experience at all. So I'm thinking, for now, of prototyping this as a web page or two, maybe using a lightweight framework rather than vanilla HTML (not sure yet), and letting him interact with it on the iPad. I'll try to get the basic visual elements right with CSS/JS (the picture, the rows of squares for typed letters, the bag of letters to choose from below) and see if I can iterate from there. That's the idea, currently.
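Here's roughly what I mean, as a vanilla-JS/DOM sketch of the screen layout (the sizes, image/audio paths, and distractor letters are all invented; wrong taps are simply ignored for now, and the hint row and progress tracking aren't in here yet):

```javascript
// Picture on top, a row of answer squares in the middle, and a "bag"
// of big tappable letters below. Run after the page has loaded,
// e.g. from a <script> at the end of <body>.

function shuffle(arr) {
  // Fisher-Yates shuffle so the letter bank is laid out differently each time.
  const a = arr.slice();
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}

function renderWordScreen(entry, distractors = ["O", "B", "P"]) {
  document.body.innerHTML = "";
  let position = 0; // index of the square waiting to be filled

  // 1. Picture at the top.
  const img = document.createElement("img");
  img.src = entry.picture;
  img.style.cssText = "display:block;margin:0 auto;max-height:40vh";
  document.body.appendChild(img);

  // 2. Row of blank squares, one per letter of the word.
  const row = document.createElement("div");
  row.style.cssText = "display:flex;justify-content:center;gap:12px;margin:24px";
  const squares = [...entry.word].map(() => {
    const sq = document.createElement("div");
    sq.style.cssText = "width:90px;height:90px;border:5px solid #333;" +
      "font-size:64px;display:flex;align-items:center;justify-content:center";
    row.appendChild(sq);
    return sq;
  });
  document.body.appendChild(row);

  // Highlight the square to fill next (a stand-in for the blinking cue).
  const markCurrent = () => squares.forEach((sq, i) => {
    sq.style.borderColor = i === position ? "crimson" : "#333";
  });
  markCurrent();

  // 3. Bag of letters: the word's own letters plus a few distractors.
  const bank = document.createElement("div");
  bank.style.cssText = "display:flex;flex-wrap:wrap;justify-content:center;gap:16px";
  for (const letter of shuffle([...new Set([...entry.word, ...distractors])])) {
    const btn = document.createElement("button");
    btn.textContent = letter;
    btn.style.cssText = "width:110px;height:110px;font-size:72px";
    btn.onclick = () => {
      if (letter !== entry.word[position]) return; // wrong tap: ignore for now
      squares[position].textContent = letter;      // fill the current square
      position++;
      markCurrent();
      if (position === entry.word.length) {
        new Audio(entry.audio).play();             // victory / word-audio clip
      }
    };
    bank.appendChild(btn);
  }
  document.body.appendChild(bank);
}

renderWordScreen({ word: "ADAM", picture: "img/adam.jpg", audio: "snd/adam.mp3" });
```

If tapping giant buttons like this feels good to him on the iPad, I can worry about a framework, recorded audio, and storing progress afterwards.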