• 0 Posts
  • 30 Comments
Joined 1 month ago
Cake day: December 6th, 2024

  • Just adding to this from the other side (ish) of it.

    The point being that what you describe is a broader phenomenon: at least amongst techies, taking into account the point of view of the people on the other side, and choosing objective-oriented language with minimal or no social niceties once you figure out they’re constrained in the time they have for handling messages like yours, is something one learns rather than something that comes naturally.

    The same kind of thing applies, for example, when applying to certain jobs: in your cover letter or even CV, you put all the stuff they care about for baseline selection upfront, and the kind of stuff that matters “if they’re interested” comes afterwards, so that if it’s clearly not a fit, people’s time doesn’t get wasted. It’s considerate towards the people on the other side and, as somebody who has been on that side, it’s appreciated and shows professionalism, which will help the candidate if they do seem interesting from reading that baseline selection info.

    Not the same thing as your specific situation, but the same pattern, IMHO.



  • It eliminates the “depends on a specific distribution” problem and, maybe more importantly, it solves the “depends on a specific distribution version” problem (i.e. works fine now but might not work at all later on the very same distribution, because some libraries are missing or the default configuration is different).

    For example, one of the games in my GOG library is over 10 years old and has a native Linux binary, which won’t run on a modern Debian-based distro by default because some of the libraries it requires aren’t installed (meanwhile, the Windows binary works just fine under Wine). It would be kinda deluded to expect the devs to keep updating the native Linux build (or even the Windows one) for over a decade, whilst if it had been released as a Docker app, that would not be a problem.

    So yeah, stuff like Docker does have a reasonable justification when it comes to isolating an application from external dependencies its devs have no control over, especially when it comes to future-proofing your app: the Docker API itself needs to remain backwards compatible, but there is no requirement that Linux distros be backwards compatible (something which would be much harder to guarantee).

    Mind you, Docker and the like are a bit of a hack to solve a systemic (cultural, even) problem in software development, which is that devs don’t really do proper dependency management and just throw everything and the kitchen sink, in terms of external libraries (which then depend on external libraries, which in turn depend on more external libraries), into the simplest of apps. But that’s a broader software development culture problem: most present-day developers only ever learned the “find some library that does what you need and add it to the list of dependencies of your build tool” way of programming.

    I would love it if we solved what’s essentially the core Technical Architecture problem in present-day software development practices, but I have no idea how we can do so, hence the “hack” of things like Docker, which pretty much includes the whole runtime environment (funnily enough, a variant of the old way of building your apps statically with every dependency) to work around it.
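
    To make the future-proofing point concrete, here’s a rough sketch of what packaging such a game could look like (the base image and package names are purely illustrative, not from any real release): the image gets built once, while the pinned distro release is still current, and the frozen result keeps running long after that release is EOL.

    ```dockerfile
    # Illustrative sketch only: pin the exact distro release the game was
    # built against; the image freezes that userland, so later changes to
    # the distro can't break the game.
    FROM debian:7

    # Install the shared libraries the binary links against (example names).
    # This runs once, at build time, while these packages still exist.
    RUN apt-get update && apt-get install -y \
        libsdl1.2debian libogg0 libvorbis0a

    # Ship the game binary as released and run it.
    COPY ./game /opt/game
    CMD ["/opt/game/start.sh"]
    ```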



  • At some point in my career, I worked in Investment Banking, making custom software directly for people like Traders (so in the area of IT which, in that industry, is called the Front Office).

    Traders have almost no free time, hence no time for social niceties, plus they’re “the business”: the reason Front Office IT exists and whom it works for. So eventually you just have to figure out their point of view, and accept that the only way to do the part of your work that requires interacting with them (figuring out what they need, or letting them know what’s now available for them to use) is to use straightforward, objective-oriented talk like that.

    It was actually quite a learning experience for me as a techie: learning how to interact with time-constrained people who aren’t going to change to suit you, in a way that best serves what’s needed by both sides.



  • Whilst I agree with you on everything but the first 2 words of your post, I think this is yet another “look at this cool gadget” post that overhypes something, and that is a kind of spam we get a bit of around here, even if nowhere near the levels of the Elon crap or even just US politics.

    This is especially frustrating for people who, like me, looked at the diagram they link from their article and found out it’s pretty much the same as a run-of-the-mill breadboard power adaptor with a USB-C connector and a slightly better design than the cheap Chinese ones, rather than something truly supporting USB-PD (this thing doesn’t even support the basic USB negotiation, dating back to USB 1.0, needed to get more than 150mA from a proper USB host).

    That the article then mentions a “crowdfunding campaign” for something a junior EE could design with a bit of datasheet digging carries a bit of a stink of cash-grab, so seeing it as spam is understandable.


  • If you look at the circuit diagram in their documentation linked from that article, that thing doesn’t support USB-PD, or even just the device side of the USB negotiation (which dates back to USB 1.0) needed to raise the current limit from the default (150mA in USB 3) to high (900mA in USB 3). It will look like it works fine if you connect it to a dumb USB power supply, because those things don’t really do any USB protocol stuff: they just dumbly supply power over USB connectors up to the power source’s limit. But if you connect it to, say, a PC USB port (which does implement the host side of the USB protocol), the circuit on your breadboard that worked fine with a dumb USB power supply might not work, because the current it needs exceeds that default 150mA limit for devices that haven’t done USB negotiation (worse if it’s a USB 2.0 port, as the limit is lower for those).
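
    To give an idea of how basic that device-side negotiation is, here’s a sketch (in C) of the standard USB configuration descriptor layout: the device essentially declares its current budget in a single byte (bMaxPower) and the host accepts or rejects the configuration. The values below are illustrative, not taken from this product.

    ```c
    #include <stdint.h>

    /* Standard USB configuration descriptor layout (per the USB spec).
     * Until the host accepts a configuration, a USB 3 device must stay
     * within the default 150mA budget. */
    struct usb_config_descriptor {
        uint8_t  bLength;             /* 9: size of this descriptor */
        uint8_t  bDescriptorType;     /* 0x02: CONFIGURATION */
        uint16_t wTotalLength;        /* this + all interface/endpoint descriptors */
        uint8_t  bNumInterfaces;
        uint8_t  bConfigurationValue;
        uint8_t  iConfiguration;      /* string descriptor index, 0 = none */
        uint8_t  bmAttributes;        /* 0x80: bus-powered */
        uint8_t  bMaxPower;           /* in 8mA units on USB 3 */
    } __attribute__((packed));

    /* Illustrative: a device asking for the full high-power budget,
     * 112 * 8mA = 896mA (~900mA). */
    static const struct usb_config_descriptor cfg = {
        .bLength = 9, .bDescriptorType = 0x02,
        .wTotalLength = 9, /* a real device would add its sub-descriptor sizes */
        .bNumInterfaces = 1, .bConfigurationValue = 1, .iConfiguration = 0,
        .bmAttributes = 0x80, .bMaxPower = 112,
    };
    ```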

    This thing is basically the same as the Chinese breadboard power adaptors you can get in places like AliExpress, but with a USB-C connector instead of a Type-A, micro-USB or mini-USB one; plus it’s better designed (it has a proper Buck Converter instead of a cheap Voltage Regulator, along with better power supply filtering and a polyfuse to protect it and the host from current overdraws).

    The headline and the article seriously exaggerate this “achievement”.


  • TL;DR - It’s a nice but pretty run-of-the-mill breadboard power adaptor which happens to have a USB-C connector; the article and its title insanely oversell the thing.

    This is not exactly as amazing an achievement as the headline implies, since the stuff necessary to talk to the upstream USB-PD host already exists in integrated form, so you just need to get a chip that does it (and even without it, you’ll get 150mA @ 5V by default out of an upstream USB 3 host, and up to 900mA with some pretty basic USB negotiation, in a protocol that dates from USB 1.0 and for which there have long been integrated solutions on both the device and the host sides).

    Further, converting those 5V down to 3.3V just requires a buck converter, or even just a voltage regulator (though the latter option is less efficient), for which there are already lots of integrated solutions available for peanuts, and where the entire circuit block needed to support them is detailed in the datasheet of that converter.
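
    For illustration, the idealised numbers behind that efficiency remark:

    ```latex
    % Ideal buck converter: the switching duty cycle D sets the output voltage.
    V_{out} = D \cdot V_{in}
      \;\Rightarrow\; D = \frac{3.3\,V}{5\,V} = 0.66
    % Linear (voltage) regulator: the whole voltage drop is burnt as heat,
    % so its efficiency is capped at
    \eta \le \frac{V_{out}}{V_{in}} = \frac{3.3\,V}{5\,V} \approx 66\%
    ```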

    Looking at the circuit diagram for this (linked from the article), they’re not even doing USB-PD negotiation, or any kind of the basic negotiation that dates back to USB 1.0, so this thing will be limited to 150mA from a USB 3 host, or to whatever current your traditional USB power source can supply. (Such power sources really just supply whatever amperage they support over a cable that happens to have USB connectors, rather than genuinely implementing a USB host that limits current based on negotiation with the device, so they don’t require the device to do any USB negotiation to go above 150mA.)

    This is really “yet another run-of-the-mill USB breadboard power adaptor”, only the USB plug is USB-C rather than mini-USB or micro-USB (so, a different plug plus the handful of minor components the standard requires to properly support it), which makes it pretty much the same as the cheap Chinese ones you can get from AliExpress. This one does use a Buck Converter rather than the $0.1 Voltage Regulator found on most of the Chinese boards, and it does proper filtering of power supply noise and proper protection against overcurrent, so it is a quality design for such things; it’s just not a major advancement.

    Without the USB-PD stuff, I wouldn’t really say it brings USB-C Power to the breadboard (in the sense, as many would expect, of being able to draw a proper amount of power from a modern USB 3.0 power brick that supports USB-C); it’s more something with a USB-C connector that brings power to the breadboard, as that connector is really the total sum of what it supports from the modern USB spec.

    What would really be nice would be something that does talk USB-PD to the upstream host AND can convert down from the 20V at which it supplies peak power, so that you could take advantage of the juicy, juicy (oh so juicy!) power USB-PD can deliver (up to 100W right now, and up to 240W in the latest revision of the spec). Mind you, if you’re pulling 100W from your breadboard power adaptor (which at 5V means 20A, a stupidly high current that will melt most components in a typical digital circuit), then I’m pretty sure magic smoke is being released from at least one of the components on that breadboard and, by the way, you’re probably damaging the power rails of that breadboard (aah, the sweet smell of burnt plastic when you turn on the power for your half-arsed experimental circuit!!!).
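
    Just to spell out the arithmetic behind those numbers (I = P / V):

    ```latex
    % The same 100W at USB-PD's 20V peak-power level vs. stepped down to 5V:
    I_{20V} = \frac{100\,W}{20\,V} = 5\,A
    \qquad
    I_{5V}  = \frac{100\,W}{5\,V} = 20\,A
    ```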


  • I have a cheap N100 mini-PC running Lubuntu with Kodi on it, plus a wireless remote, as my TV box, and use my TV as a dumb screen.

    Mind you, you can do it even more easily with LibreELEC instead of Lubuntu, and more cheaply with one of its supported cheap SBCs plus a case instead of a mini-PC.

    That said, even the simplest solution is beyond the ability of most people to set up, and once you go up to the next level of ease of setup - a dedicated Android TV Box - you’re hit with enshittification (at the very least, preconfigured apps like Netflix with matching buttons on your remote), even if you avoid the big brands.

    Things are really bad nowadays unless you’re a well-informed tech expert with the patience to dive into this stuff at home.



  • Not living in the US, I’m not up to date with US salaries.

    That said, even for administrative personnel paid $25/h, $25 pays for a full hour of somebody’s work, which is way beyond what’s needed to close a retail customer account in any modern administrative system where such a thing is a common operation. It should take less than a minute to do, because the people who design this kind of company administrative system (such as yours truly, at least during part of my career) make the most common business operations the fastest to do in that system.


  • It deceives people whose idea of how things work in large companies hasn’t changed since the days when it was the manager of your bank branch who decided whether you should get a loan or not.

    Nowadays, certainly in mid-size and large companies, all the main administrative business pathways are heavily if not totally automated, and it’s customer support that ends up eating the most manpower (which is why there has been so much of a push for automated phone and chat support systems, lately using AI).

    That $25 for “account closure” pays, at worst, for a few minutes of somebody looking up the account from the user’s information on a computer, cross-checking that the information matches, then clicking a button that says “Close account” and “OK” on the confirmation box; the roughly 99% left over after that cost is pure profit.


  • As somebody who works in designing software systems, including for large companies, let’s just say that the amount of human time that goes into a customer account closure is negligible, because main business operations such as opening and closing customer accounts are the ones that get automated the soonest and the furthest.

    The stuff that uses “lots” (in relative terms) of manpower is supporting customers with really unusual problems involving third parties, and even then, spending the 2.5 man-hours that $25 would pay for (assuming the administrative person gets paid $10 per hour) is pretty uncommon.

    You’ve been lied to, repeatedly, for at least 3 decades.


  • “Perceived value”

    Without that element, there would be no explanation for Marketing working, other than pure Brand Awareness promotion (and McDonald’s is definitely beyond needing more Brand Awareness, at least in the Developed World).

    Even then, it doesn’t explain a lot of how Marketing does its work (namely the stuff they took from Psychology and use to do things like create associations between a brand and specific feelings in people’s subconscious - you know, the way cars are “freedom” and perfumes are “sex”).

    And don’t get me started on other techniques that prey on human cognitive weaknesses (for example, FOMO would not work on the fabled Homo Economicus that underpins so much of Free Market Theory).

    Anyways, a ton of present-day enshittification (and that includes this kind of price inflation) relies on people having a well-entrenched positive perception of a brand after years of having a relationship with it (i.e. choosing it as customers), and on there being quite a lot of momentum behind that. It also relies a lot on a “slow boiling” effect to keep people from spotting the full picture of the changes.


  • The problem is that what people need in the environment we live in (i.e. Capitalism) is monetisable skills, whilst College Degrees, even when teaching people things that are very hard, if not impossible, to learn elsewhere, often do not in fact provide monetisable skills. A good example would be most Arts Degrees. I was lucky that my natural inclination was towards Science and Engineering and that I had a knack for Programming; had I gone down the other direction I had a bit of a knack for - Acting - my life would’ve been totally different (judging by some acquaintances of mine from that world, it would’ve been a way, way harder life in the financial sense).

    As for hiring people from third-world countries, in the case of India (if that’s one of the ones you mean), having had several colleagues from there, and from talking to them: whilst indeed most people there don’t have even basic computer literacy (I’m not even sure a majority can read and write), there are people who do have access to the same stuff as in Developed Countries (at worst they just pirate it), and even though they’re a small fraction of the population, in a country with that many people it still adds up to a large number. Companies abroad aren’t hiring the poor countryside illiterates who can’t even speak English (which I believe most people in India can’t); they’re hiring the Middle and Upper Middle class from over there, and given the massive, massive inequality there, those did have access to modern computers and software.

    The same thing applies to places like South America - lots of poor people who are totally computer-illiterate (often just plain illiterate in the general sense), but a minority who did have access to all the same things as in Developed Countries - maybe with less powerful computers and mostly pirated software, but still the same stuff.

    That said, I totally agree that college degrees shouldn’t be required for many of the positions they are required for nowadays. The degrees there aren’t really required because they teach things needed, or even useful, for those positions; they’re required because there’s an imbalance of supply vs demand for those jobs (too many candidates, too few jobs), so those hiring just put the requirement in because “we lose nothing from doing it and, who knows, maybe the degree will come in handy at some point”, plus it’s an easy way to thin out the applications.

    If the job market was tighter, demanding degrees for jobs not requiring them would stop. And, yeah, at least in certain areas, the mainstream parties helping out business interests by giving away work visas like confetti is the reason the job market is not as tight as it would naturally be.


  • I can’t really speak for other areas, but at least in Science and Engineering, what College gives you that you don’t get “in the field” (because it doesn’t directly lead to operational results, so you don’t get the time to learn it) is the foundations for the work you do: the Mathematics, and the “why” certain things work as they do rather than merely the “how” to do them (or at least it did back when I got my degree 3 decades ago).

    Mind you, in my experience your “9 out of 10 times” point is probably right. At least in what I do - Software Development - that kind of knowledge is only useful in a fraction of the work, for a fraction of the people: generally the kind of developer who is a “tool maker” rather than just a “tool user”, working on things which stretch the envelope of what can be done and are innovative or unusual technically (so, not “innovative” business models, which are what most Tech Startups do nowadays). The ratio of those people to the general universe of even just Software Development (much less Tech in general) might actually be worse than 1 in 10.

    I suppose what I’m trying to say is that, at least in Tech, you’re not going to be all that great at work which extends the boundaries of what is possible without the kind of foundations a good Science or Engineering degree will give you - hence there is value in such education - but the vast majority of people, even in the supposedly expert positions in Tech, aren’t extending the boundaries of what is possible, not even close.

    (In other words, I’m expanding on what you said rather than disagreeing with it)



  • “Any other good in comparison”

    “Arguing good option bad…”

    The second line doesn’t logically follow from the first: you’re talking about a relatively better option all the way to that top line, and then you switch from “better than the other” to “good”. It’s like going on about how, in a choice between being knifed twice and being knifed just once, “just knifed once” is good in comparison, and then jumping from that to saying that getting knifed once is good.

    Even beyond that totally illogical jump, the other flaw of logic is treating each election as a unique, totally independent choice whose results have no impact on the options available in subsequent choices - i.e. as if who the Democratic Party and the Republican Party put forward as candidates in an election isn’t at all influenced by how the electorate responded to the candidates they put forward in previous elections. It is absolutely valid for people to refuse to vote for Kamala to “send a message to the Democratic Party” (i.e. to try to influence the candidates the party puts forward in subsequent elections), and it’s around the validity or not of risking 4 years of Trump to try and get an acceptable Democratic candidate at the end of it that the discussion should be (and there are valid points both ways), not the hyper-reductive fallacy you seem so wedded to.

    Choices in the real world are a bit more multifaceted, with far more elements and implications, than that self-serving “simpleton” slogan the DNC pushed out in its propaganda, which you are parroting.


  • I use a pretty basic one (with an N100 processor and Intel integrated graphics) as a TV box + home server combo, and it’s excellent for that.

    It’s totally unsuitable for gaming, unless we’re talking about stuff running in DOSEMU or similar, and even then I’m using it with a wireless remote rather than a keyboard + mouse, which isn’t exactly suitable for PC gaming.

    Mind you, there are configurations with dedicated graphics, but they’re about 4x the price of the one I got (which cost me about €120), and at that point you’re starting to enter the same domain as small-form-factor desktop PCs built with standard motherboards, which are probably better for PC gaming, simply because you can upgrade just about anything in those, whilst hardware upgradeability in mini-PCs is limited to only some things (like the SSD and RAM).