I get why people always say programming back in the day was way more difficult than it is now, but not all of that is correct. Yes, it was difficult to get your hands on documentation, and even if you did, you probably had to go through a library of books. Even then, it was hard to get time on a terminal: you either had to pay for time on a time-sharing machine or be part of some computing laboratory to be able to actually program anything. But once all of that was managed, writing programs was not so complicated, in the sense that the programs considered cutting edge back then are what we nowadays call high school projects, and there were not many programming languages to stress yourself over learning; only a few existed. There were not many development standards to adhere to, and most programs were terminal based, so you didn't have to worry about dealing with user interfaces or making your program fancy enough for your users (users back then were mostly computer scientists and mathematicians, highly intelligent people who cared more about function than design). There were also fewer programs to compete with, so however you wrote your program, people would still use it.
Now, talking about modern day programming: it is easy enough for even a nine-year-old to start programming computers. There are lots of materials, documentation, videos, and even interactive tutorials at anyone's disposal on the internet. Programming languages come in variety, different types with different paradigms, many of them simplified and readable even by someone with little knowledge of programming. Besides that, there are lots of tools and lots of reusable code that anyone can easily get their hands on, thanks to the Open Source Community. But having all this at one's disposal will not make that person a good programmer, and it definitely does not ensure the development of cutting edge software. It is tempting to say that every piece of software has already been developed; of course that is not true, but coming up with an idea that has never been implemented is nearly impossible. Even if one manages to come up with such an idea, one needs to learn the best and most recent technologies used in that area of development. Implementing to today's standards is far more difficult than it used to be, and developing your idea in a way that shines above all the software out there will take months, if not years, of sleepless nights of tweaking, debugging, and recoding (depending on the size of the software). Another point: back in the day, a computing technology might last 10-15 years, but these days technologies change so fast and the field evolves so quickly that, within the lifetime of a single piece of software (depending on how big it is), it may have to be tuned to meet 3-4 different generations of technology.
This analysis is a personal opinion and is not meant to start a war between old school and modern developers, but simply to point out that the simplicity of software development greatly depends on perspective and on what is taken into consideration. Bitching about how hard it is, or stating all the ifs, will not change anything, so raise your head, grab a PC, and start hacking, because before you are able to say "Jack Robinson", someone will have put out that idea that has been roaming in your head for quite some time now, and technology will have evolved yet again.