It is mutually assured destruction. The job seeker AI spams out a resume to every listing and the hiring AI rejects all applicants for not meeting some unknown criteria. In the end, no worker can find a job and no employer can get applicants. Companies go back to only hiring friends and families of existing employees.
I am used to seeing ring buffers implemented with an array. They are FIFO if you write at the maximum offset and read from the minimum offset, but they become double-ended if you also have methods to read from the maximum offset and write at the minimum offset.
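A minimal sketch of the idea in Python (the names here are mine, not from any particular library): with only push_back and pop_front it is a FIFO, and the two extra methods are what make it double-ended.

    class RingDeque:
        # Fixed-capacity, double-ended queue backed by a plain array.
        def __init__(self, capacity):
            self._buf = [None] * capacity
            self._head = 0  # index of the first element
            self._size = 0

        def push_back(self, item):  # write end of the FIFO
            if self._size == len(self._buf):
                raise OverflowError("ring buffer full")
            self._buf[(self._head + self._size) % len(self._buf)] = item
            self._size += 1

        def pop_front(self):  # read end of the FIFO
            if self._size == 0:
                raise IndexError("ring buffer empty")
            item = self._buf[self._head]
            self._head = (self._head + 1) % len(self._buf)
            self._size -= 1
            return item

        # These two methods are the double-ended part.
        def push_front(self, item):
            if self._size == len(self._buf):
                raise OverflowError("ring buffer full")
            self._head = (self._head - 1) % len(self._buf)  # Python % wraps negatives
            self._buf[self._head] = item
            self._size += 1

        def pop_back(self):
            if self._size == 0:
                raise IndexError("ring buffer empty")
            self._size -= 1
            return self._buf[(self._head + self._size) % len(self._buf)]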
Why would you use a library or framework when you can code everything from scratch? It probably depends on how good the VSCode extension is vs how bad the IDE is.
For the languages I have tried (mostly GoLang plus a bit of Terraform/Terragrunt), the VSCode plugins can do code highlighting, flag syntax and lint errors, navigate to a method's implementation, and find a method's callers; the auto-complete seems to pick random words from the code base. It is good enough for everyday use.
The IDEs I have used (Eclipse for Java, PyCharm, IntelliJ for Kotlin) offer more. They all have starter templates for common file types. The auto-complete is much more syntax-aware and can sometimes guess which variables I intend to pass as arguments. The refactoring tools can correctly find other usages of a variable and make trivial code rewrites. There are generators for boilerplate methods. They all have a built-in graphical debugger and a test runner.
I was a bit surprised that deque is implemented as a linked list and not, for example, a ring buffer. A ring buffer would make index reads constant time (though insert and delete at an arbitrary index would be linear time), the opposite trade-off from a linked list.
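A quick, unscientific way to see the trade-off in CPython: index into the middle of a deque and a list of the same size and compare.

    from collections import deque
    import timeit

    d = deque(range(1_000_000))
    l = list(range(1_000_000))

    # Middle-of-container reads: linear time for the linked deque,
    # constant time for the array-backed list.
    print(timeit.timeit(lambda: d[500_000], number=100))
    print(timeit.timeit(lambda: l[500_000], number=100))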
TAOCP has a reputation for being very deep and thorough, not for being a good introductory text. One of my professors said that in his (very long) industrial career, he met only one person who had actually read the books from beginning to end, but nearly everyone had looked something up in them once or twice.
That has been my experience. I once needed to find out how to solve a very specific problem (I think it was calculating statistical values on an infinite stream). I found the single copy of TAOCP in the office reference library, read the relevant section, and implemented the suggested algorithm.
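If it was running mean and variance, the single-pass recurrence Knuth presents in Volume 2 is Welford's algorithm. I do not remember exactly what I wrote back then, but it would have looked something like this:

    class RunningStats:
        # Welford's single-pass running mean/variance for a stream.
        def __init__(self):
            self.n = 0
            self.mean = 0.0
            self.m2 = 0.0  # running sum of squared deviations from the mean

        def update(self, x):
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)  # note: uses the updated mean

        def variance(self):
            # sample variance; needs at least two observations
            return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    stats = RunningStats()
    for x in (1.0, 2.0, 4.0, 7.0):  # stands in for an unbounded stream
        stats.update(x)
    print(stats.mean, stats.variance())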
Maybe it is just my experience, but in the last decade, employers stopped trying to recruit and retain top developers.
I have been a full time software engineer for more than a decade. In the 2010s, the mindset at tech giants seemed to be that they had to hire the best developers and do everything they could to keep them. The easiest way to do both was to be the best employer around. For example, Google had 20% time, many companies offered paid sabbaticals after a certain number of years, and every office had catering once a week (if not a free cafeteria). That way, employees would tell all of their friends how great their employer was, and if they decided to look for other work, they would have to give up their cushy benefits.
Then, a few years before the pandemic, my employer switched to a different health insurance company and got the expected wave of complaints (the price of this drug went up, my doctor is not covered). HR responded with "our benefits package is above industry averages". That is a refrain I have been hearing ever since, even after switching employers. The company is not trying to be the best employer, the one everyone wants to work for; it just wants to be above average. The message is "go ahead and look for another employer, but they are probably going to be just as bad".
Obviously, this is just my view, so it is very possible that I have just been unlucky with my employers.
pip is a perfectly usable package manager and is included in most Python distributions now. Is it perfect? No, but it has been good enough for every team I have been on.
I had a worse experience. My first internship was doing web development in ColdFusion. Why that language? Because when the company was first starting, none of the founders wanted to learn Linux/Apache administration, and CF ran on Windows.
Also, the front-end development team did not use version control; they shared code via a file server.
C does exactly what you tell it, no more. Why waste cycles setting a variable to a zero state when a correct program will set it to whatever initial state it expects? It is not user-friendly, but it is performant.
Are we really doing fine? 4% Linux market share? Windows as the default?
I suspect that the issue hindering adoption is GNU and the other userland projects, not the Linux kernel. Plenty of people use devices that pair a Linux kernel with an easy-to-use UI and popular software (see Android and Chromebooks).
Many people would happily switch to a Linux-based OS that had the exact same GUI as their current OS and ran the exact same software. That is not a realistic requirement in practice.
It is possible that Linux would see more adoption if more money were invested in drivers for a wider range of hardware, but having Linux kernel developers write drivers instead of hardware vendors is not a strategy that scales well.
From a quick look at the repo, it is an end-to-end testing tool for web applications.
Also, it seems that their big selling point is a verbose, English-like syntax.