I've been in software development since the late '90s. I used to do VB and SQL Server, then got stuck in a job that didn't progress my skills much for a long time. For the past few years I've been doing C#, ASP.NET and SQL Server, developing web applications. I've done a lot of studying to get my skills up to date, aiming to become a "full-stack" developer. However, I've had three jobs (permie) in a row, including my current one, that I've really struggled in.
The technology is only part of the problem. It seems to me that software development as a career is much harder now than it was in the late '90s. I can think of three reasons for that:
1) Modern software applications tend to use far more technology. In the old days it was just VB, SQL Server and maybe a data-access layer (DAO, RDO, ADO). Nowadays it's C#, SQL Server, ASP.NET, HTML, CSS, JavaScript, jQuery, Bootstrap, XML, TypeScript, Sass, SignalR, .NET Core, Docker, Kubernetes and microservices.
2) The applications themselves are far more complex. They have much more functionality, and users expect far more from them than in the past: cross-platform support, strong security, high availability, scalability, etc.
3) Iterative development methodologies (e.g. Agile) have led business people and end users to have very high expectations of a development team. They expect rapid progress in delivering bug fixes, improvements, new functionality and so on. There is a tight (e.g. fortnightly) cycle of planning, recording, monitoring and reviewing progress through work tasks. Developers are expected to account for almost every hour of their working day on a timesheet: not just the hours worked, but allocating those hours to specific tasks so that management can see how long a developer spent on any one. It feels like relentless pressure to deliver.
Even source control seems more difficult nowadays. I was quite comfortable using SourceSafe and TFS; I felt I understood what they were doing, at least from a user's perspective. However, nowadays I'm using Git, and even after three years I still don't understand it. Every time I need to merge or rebase I tend to have problems and lose faith in the process. It's just my opinion, but I find it very unintuitive.
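To give a concrete (if hypothetical) example of the kind of routine I mean, just bringing a feature branch up to date looks something like this (branch names are placeholders), and it regularly goes wrong for me at the conflict stage:

    git fetch origin                # pull down the latest remote history
    git rebase origin/main          # replay my commits on top of it
    # if conflicts appear, fix each file, then:
    git add <file>
    git rebase --continue
    # or, when I lose faith entirely:
    git rebase --abort              # go back to where I started

Compared with just checking files in and out, it feels like a lot of ceremony.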
I'm getting a bit tired of struggling so much. I really expected it to have got easier by now, and I'm beginning to wonder whether it might be over-ambitious to aim for full stack. Then again, whenever I look for dev jobs in the language I'm familiar with (currently C#), they almost always require the web skills as well as the language. I suppose there are other areas I could move into that are still within IT (e.g. testing, technical authoring, database administration), but I'm not sure I fancy those. I think part of my struggle is in understanding the business domain (not just the language), and that would be just as necessary for a tester as it is for a developer.
The type of dev work that interests me most is the middle-tier and back-end stuff (OOP, design patterns, parallel processing). I'm not sure there is much call for that on its own, though, without also needing the front-end skills.
Wondering if anyone else finds it tough going nowadays, and if anyone has any ideas/suggestions?