The most ambitious of my current avenues of exploration is looking for ways to reduce the burden of having to 'think like a computer' in order to program, instead pushing a programming environment incrementally towards more ambiguous, human, goal-oriented thinking.
In this context, I was pointed towards Wolfram|Alpha's work on a 'Natural Language Understanding System'.
The following is their description of the technology. They're deploying this as part of the Wolfram Language and specifically refer to programming as an intended function: 'Wolfram NLU lets you specify simple programs purely in natural language, then translates them into precise Wolfram Language code.'
Complex linguistics, not statistics
Wolfram NLU is set up to handle complex lexical and grammatical structures, and translate them to precise symbolic forms, without resorting to imprecise meaning-independent statistical methods.
Learning from users
The high performance of today's Wolfram NLU has been achieved partly through analysis of billions of user queries in Wolfram|Alpha.
Wolfram NLU routinely combines outside information, like a user's geolocation or conversational context, with its built-in Knowledgebase to achieve extremely high success rates in disambiguating queries.
Curating natural language
Wolfram NLU has a huge built-in lexical and grammatical Knowledgebase, derived from extensive human curation and corpus analysis, and sometimes informed by statistical studies of the content of the web.
Understanding raw human thoughts
Wolfram NLU is set up not only to take input from written and spoken sources, but also to handle the more "stream-of-consciousness" forms that people type into input fields.
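As a concrete illustration of the natural-language-to-code translation described above, the Wolfram Language exposes this NLU layer through documented functions such as SemanticInterpretation and Interpreter. The specific inputs below are my own illustrative examples, not Wolfram's; the exact results depend on the live Knowledgebase, so this is a sketch of the interaction rather than guaranteed output:

```
(* Translate a free-form natural-language request into a
   precise Wolfram Language expression *)
SemanticInterpretation["three largest countries by population"]

(* Interpreter coerces loose text into a typed value,
   here a structured DateObject *)
Interpreter["Date"]["next tuesday"]
```

Both calls send the text through the same NLU pipeline Wolfram describes: the ambiguous phrasing is resolved against the Knowledgebase and returned as symbolic, computable Wolfram Language values.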
This remains to be explored in more depth, but it is certainly reassuring that my line of inquiry is valid.