Aceticon

joined 2 months ago
[–] Aceticon@lemmy.dbzer0.com 2 points 3 weeks ago* (last edited 3 weeks ago) (9 children)

There are two different ways to read the previous poster's point:

  • That any kind of quota system (no matter whose "born with certain genetic traits" group it favours) is generally bad and causes more problems than it solves. From what I've observed in my one and only time working in a place with such quotas, that's what I saw: there were very incompetent people from the favoured group who clearly only got the job due to quotas, and at the same time competent members of that group had trouble being taken seriously because they were assumed to be incompetent and to have only got the position due to having the genetics that made them a member of said favoured group (they were de facto seen as second class). So in general I would agree that privileging anybody in hiring due to the genetics they were born with is wrong. This is not to be confused with systems that try to make sure nobody is discriminated against due to the genetics they were born with, systems I totally agree with: basically, I disagree with people being given different treatment, when it comes to selection for a professional occupation, due to their genetics.
  • That women and non-straight men are a problem in that profession. If that's the take, I not only totally disagree with it but find it appalling and unacceptable. Again, experience tells me that in IT women and non-straight men are neither less nor more competent than straight men: from what I've observed, gender and sexual orientation are, as expected, entirely irrelevant when it comes to professional competence in that domain. One needs to have no clue whatsoever about that domain, and be an abnormal simpleton, to think gender or sexual orientation is what makes somebody a good or bad professional in any of the various areas of the Industry.
[–] Aceticon@lemmy.dbzer0.com 2 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

A family of software development processes for teams, which focuses on cycles of quickly building and delivering smaller blocks of program functionality (often just a single program feature - say: "search customers by last name" - or even just part of a feature) to end-users, so as to get quick feedback from those users of the software, which is then used to determine what should be done in subsequent cycles.

When done properly it addresses the issues of older software development processes (such as the Waterfall process) in situations where the users don't really have a detailed vision of what the software needs to do for them (which is the most usual situation, unless the software just helps automate their present way of doing things), or where there are frequent changes in what they need the software to do for them (i.e. they already use the software but frequently need new features or tweaks to existing features).

In my own career of over two decades I've only ever seen it done properly maybe once or twice. The problem is that "doing Agile" became fashionable at a certain point maybe a decade ago, and pretty much a requirement to have on one's CV as a programmer. So you end up with lots of teams mindlessly "doing Agile": they adopt some of the practices (say, the stand-up meeting or pair programming) without including other practices and elements of the process (and without adjusting them for their local situation), thus not achieving what the process is meant to achieve. Essentially they don't understand it as a software development process which is more adequate for some situations and less for others, nor what it is actually supposed to achieve and how.

(The most frequent things not being done are those around participation of the end-users of the software: evaluating what was done in the last cycle, determining new features and feature tweaks for the next cycle, and prioritizing them. The funny bit is that these are core to making Agile deliver its greatest benefits as a software development process, so basically most teams aren't doing the very part of Agile that makes it deliver superior results compared to most other methods.)

It doesn't help that to really and fully get the purpose of Agile, and how it achieves it, you generally need to be at the level of experience where you're looking at the actual process of making software (the kind of people with at least a decade of experience and titles like Software Architect), who, given how ageist a lot of the Industry is, are pretty rare. So Agile is usually being done by "kids" in a monkey-sees-monkey-does way, without understanding it as a process, hence why it has by now, unsurprisingly, gotten a bit of a bad name (as with everything, the right tool should be used for the right job).

[–] Aceticon@lemmy.dbzer0.com 3 points 3 weeks ago* (last edited 3 weeks ago)

They're supposed to work as an adaptor/buffer/filter between the technical side and the non-technical stakeholders (customers, middle/upper management), and to do some level of organising.

In my 2 and a half decades of experience (a lot of it as a freelancer, so I've worked in a lot of companies of all sizes in a couple of countries), most aren't at all good at it, and very few are very good at it.

Some are so bad that they actually amplify uncertainty and disorganisation by, every time they talk to a customer or higher up, totally changing the team's direction and priorities.

Mind you, all positions have good professionals and bad professionals. The problem with project management is that a bad professional can screw up a lot of work by a lot of people, whilst the damage done by, for example, a single bad programmer tends to be much more contained and mainly impacts the programmer him or herself (so that person is very much incentivised to improve).

[–] Aceticon@lemmy.dbzer0.com 11 points 3 weeks ago* (last edited 3 weeks ago)

Halfway into saving the World it turns out you need some data that's not even being collected, something nobody had figured out because nobody analysed the problem properly beforehand, and now you have to take a totally different approach because that can't be done in time.

Also, the version of a library included by some dependency of a library you pulled in to do something stupidly simple is different from the version of that same library included by some dependency of a totally different library somebody else pulled in to do something else that's just as stupidly simple, and neither you nor that somebody else wants to be the one to rewrite their part of the code.
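To make that "diamond dependency" clash concrete, here's a minimal sketch in Python (using the standard library's importlib.metadata plus the third-party packaging library) that scans an installed environment and reports distributions whose declared requirements aren't satisfied by what's actually installed. The names it prints are just whatever happens to be in your environment; nothing here is specific to any particular project:

```python
from collections import defaultdict
from importlib.metadata import distributions, version, PackageNotFoundError
from packaging.requirements import Requirement  # third-party "packaging" lib

# For every installed distribution, check whether the versions it declares
# as requirements are actually satisfied by the installed versions.
conflicts = defaultdict(list)
for dist in distributions():
    for raw in dist.requires or []:
        req = Requirement(raw)
        if req.marker is not None:
            try:
                if not req.marker.evaluate():
                    continue  # requirement only applies on other platforms
            except Exception:
                continue  # marker depends on an "extra" we aren't evaluating
        try:
            installed = version(req.name)
        except PackageNotFoundError:
            continue  # optional dependency that simply isn't installed
        if req.specifier and installed not in req.specifier:
            conflicts[req.name].append(
                (dist.metadata["Name"], str(req.specifier), installed)
            )

for pkg, offenders in conflicts.items():
    for owner, spec, installed in offenders:
        print(f"{owner} wants {pkg}{spec}, but {pkg} {installed} is installed")
```

When two libraries' transitive requirements on the same package can't both be satisfied, tools surface exactly this kind of mismatch, and somebody has to rewrite or re-pin their side.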

[–] Aceticon@lemmy.dbzer0.com 8 points 3 weeks ago* (last edited 3 weeks ago)

It's funny that, without actually spotting the mistake, I just read it as the OP expecting multiple criticisms to come and basically telling the people bringing them to get in line.

[–] Aceticon@lemmy.dbzer0.com 4 points 4 weeks ago* (last edited 4 weeks ago)

Just adding to it from (more or less) the other side of it.

The point being that what you describe is a broader phenomenon. At least amongst Techies, taking into account the point of view of the people on the other side, and choosing objective-oriented language with minimal or no social niceties when you figure out they're constrained in the time they have for handling messages like the one you're sending, is something one learns rather than something that comes naturally.

The same kind of thing applies, for example, when applying to certain jobs: in your cover letter or even CV, you put all the stuff they care about for baseline selection upfront, and the kind of stuff that matters "if they're interested" comes afterwards, so that if it's clearly not a fit, people's time doesn't get wasted. It's nice for the people on the other side and, as somebody who has been on that side, it's appreciated and shows a professionalism which will help the candidate out if they do seem interesting from reading that baseline selection info.

Not the same thing as your specific situation but same pattern, IMHO.

[–] Aceticon@lemmy.dbzer0.com 1 points 4 weeks ago (3 children)

The first sign that the company you just joined is amateur hour, every hour of the day, every day of the year, is that they don't have a Staging environment.

[–] Aceticon@lemmy.dbzer0.com 4 points 4 weeks ago* (last edited 4 weeks ago)

It eliminates the problem of depending on specific distributions and, maybe more importantly, it solves the problem of depending on specific distribution versions (i.e. working fine now but possibly not working at all later on the very same distribution, because some libraries are missing or the default configuration is different).

For example, one of the games I have in my GOG library is over 10 years old and has a native Linux binary, which won't work in a modern Debian-based distro by default because some of the libraries it requires aren't installed (meanwhile, the Windows binary works just fine with Wine). It would be kinda deluded to expect the devs to keep updating the native Linux build (or even the Windows one) for over a decade, whilst if it had been released as a Docker app, that would not be a problem.

So yeah, stuff like Docker does have a reasonable justification when it comes to isolating from external dependencies which the application devs have no control over, especially when it comes to future-proofing your app: the Docker API itself needs to remain backwards compatible, but there is no requirement that Linux distros be backwards compatible (something which would be much harder to guarantee).
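As a minimal sketch of what that isolation looks like in practice (the base image, library names and paths here are illustrative, not from any actual game), the container pins the whole userland the binary was built against, so it keeps working no matter what the host distro ships ten years from now:

```dockerfile
# Freeze the userland the binary was built against; the host distro can
# change underneath without affecting what the app sees at runtime.
FROM debian:8

# Hypothetical old shared libraries the game links against.
RUN apt-get update && apt-get install -y \
    libsdl2-2.0-0 \
    libopenal1 \
    && rm -rf /var/lib/apt/lists/*

# Hypothetical install location of the game's files.
COPY ./game /opt/game
WORKDIR /opt/game

CMD ["./game.bin"]
```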

Mind you, Docker and similar tools are a bit of a hack to solve a systemic (cultural, even) problem in software development, which is that devs don't really do proper dependency management and just throw everything and the kitchen sink, in terms of external libraries (which then depend on external libraries, which in turn depend on more external libraries), into the simplest of apps. But that's a broader software development culture problem, and most present day developers only ever learned the "find some library that does what you need and add it to the list of dependencies of your build tool" way of programming.

I would love it if we solved what's essentially the core Technical Architecture problem of present day software development practices, but I have no idea how we can do so, hence the "hack" of things like Docker, which pretty much includes the whole runtime environment (funnily enough, a variant of the old way of building your apps statically with every dependency) to work around it.

[–] Aceticon@lemmy.dbzer0.com 1 points 4 weeks ago

Well, mucking about with configuration on a computer is a form of entertainment, hence its "use" in a broader sense...

[–] Aceticon@lemmy.dbzer0.com 1 points 4 weeks ago

Look for a processor for the same socket that supports more RAM and make sure the Motherboard can handle it - maybe you're lucky and it's not a limit of that architecture.

If that won't work, break up your self-hosting needs into multiple machines and add another second hand or cheap machine to the pile.

I've worked on designing computer systems that handle tons of data and requests, and often the only reasonable solution is to break up the load and throw more machines at it. For example, when serving millions of requests on a website, just put a load balancer in front of it that assigns user sessions (and their associated requests) to multiple machines: the load balancer pretty much just routes requests by user session, whilst the heavy processing is done by the machines behind it, in such a way that you can expand the whole thing by just adding more machines.
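A minimal sketch of that session-affinity routing idea (the backend names and the hashing choice are illustrative, not from any specific load balancer): the balancer itself does almost no work, it just maps a session id to one of N backends, so capacity grows by adding entries to the list.

```python
import hashlib

# Hypothetical pool of backend machines; grow capacity by adding entries.
BACKENDS = ["app-01:8080", "app-02:8080", "app-03:8080"]

def backend_for(session_id: str) -> str:
    # Stable hash, so the same session always lands on the same machine
    # (unlike Python's built-in hash(), which is randomised per process).
    digest = hashlib.sha256(session_id.encode()).digest()
    return BACKENDS[int.from_bytes(digest[:8], "big") % len(BACKENDS)]

print(backend_for("user-42"))  # always the same backend for this session
```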

In a self-hosting scenario I suspect you'll have a lot of margin for expansion by splitting services into multiple hosts and using stuff like network shared drives in the background for shared data, before you have to fully upgrade a host machine because you hit that architecture's maximum memory.

Granted, if a single service whose load can't be broken down (so that you could run it as a cluster) needs more memory than you can put in any of your machines, then you're stuck having to get a new machine. But even then, by splitting services you can get a machine with a newer architecture that can handle more memory but is still cheap (such as a cheap mini-PC) and just move that memory-heavy service to it, whilst leaving the CPU intensive services on the old but more powerful machine.

[–] Aceticon@lemmy.dbzer0.com 4 points 4 weeks ago (3 children)

At some point in my career I worked in Investment Banking, making custom software directly for people like Traders (so in the area of IT in that industry that's called the Front Office).

Traders have almost no free time, hence no time for social niceties, plus they're "the business" which Front Office IT exists for and works for. So eventually you just have to figure out their point of view, and accept that the only way you can do the part of your work that requires interacting with them (to figure out what they need, or to let them know what's now available for them to use) is to use straightforward, objective-oriented talk like that.

It was actually quite a learning experience for me as a techie: learning how to interact with time constrained people who aren't going to change to suit you, in a way that best does what's needed for both sides.

[–] Aceticon@lemmy.dbzer0.com 5 points 4 weeks ago (1 children)

... is it me you're looking for?
