Tha's 'cause we seen it a'fore an' unnerstan' contex' clooooz, not 'cause izuh ac'rate phraaaze. (<- approximately my hick-ass "white" accent & dialect if I let myself slip, or have been hanging out around family too much)
Meanwhile, most of us white folk have to work to "sound white", too, when we want/need to, because our usual accent doesn't "sound white" in the way that's meant. In plenty of cases this is quite far from our ordinary, or at least childhood (some of us all but obliterate it by adulthood, on purpose) accent. It's a poor term in this kind of context, and better ones exist.
"General American is thus sometimes associated with the speech of North American radio and television announcers, promoted as prestigious in their industry,[45][46] where it is sometimes called "Broadcast English"[47] "Network English",[4][48][49][50] or "Network Standard".[2][49][51] Instructional classes in the United States that promise "accent reduction", "accent modification", or "accent neutralization" usually attempt to teach General American patterns.[citation needed] Television journalist Linda Ellerbee states that "in television you are not supposed to sound like you're from anywhere",[52] and political comedian Stephen Colbert says he consciously avoided developing a Southern American accent in response to media portrayals of Southerners as stupid and uneducated.[45][46]"
(Wikipedia, "General American English")
That's what's intended. Not "white". They surely aren't trying to make them sound like most of the American white people I know who haven't deliberately trained away their natural accent & dialect—and I don't even live in the South!
Holy shit, y'all weren't kidding. That Algolia task would be many thousands of dollars of work at agencies I've worked at, and I'd expect $1-2k even bidding as a freelancer (yes, just for a mockup of this size and complexity, not prod-ready). That's so far beyond reasonable, it can no longer even see its border.
Also, WTF is this about?:
> Important: Do not fork this repository to create your assignment. Doing so will wake a bot that prints out your code, immediately sends it to the shredder, and archives your application in our applicant tracking system. And anyway we’d rather give everyone an equal shot to show us what they can do.
They want you to host it on GH Pages later on in the instructions, so what's the point of this restriction?
> ADDED: I suppose one could argue that the Bar exam is a bit like that but that's a credential as are degrees which are not necessarily all that overlapping with the real world.
Yeah, I think most other professions where one might imagine something like a stereotypical software interview use that instead: a credential obtained by a (perhaps very difficult!) test, maybe with periodic re-tests or required refresher courses to keep the credential valid.
I have a suspicion that a big part of why top-comp software companies keep their interviews so incredibly unpleasant has more to do with discouraging job-hopping among them (so, suppressing wages) than with its being the best process for hiring good developers.
> The essential convention was that a first person would not expect to see a second person in the second's own home (unless invited or introduced) without having first left his visiting card at the second's home. Upon leaving the card, the first would not expect to be admitted initially but instead might receive a card at his own home in response from the second. This would serve as a signal that a personal visit and meeting at home would be welcome. On the other hand, if no card was forthcoming, or if a card was sent in an envelope, a personal visit was thereby discouraged.
and also:
> Sample lady's visiting card, specifying an "At Home" day
> The "At Home" day was a social custom in Victorian Britain, where women of gentle status would receive visitors on a specific day of the week.
I expect this will be less noted and well-remembered than several other things they've done or are doing. This matters a lot to academics and science nerds, but won't be as big a deal to the general public as even something like the student loan forgiveness effort.
These kinds of things rarely include the people in charge—maybe middle-managers, sometimes, but not the actual big shots. See also:
- Open office plans, except the bosses get offices (there are exceptions, as with any of these other points, but that's the norm)
- Drug testing.
- Anti-moonlighting rules or other onerous contractual restrictions or claims on time off (these kinds of things apply to higher-ups more often than the other two, but it's still common for them to be universal for the "peons" while the C-suite is allowed to have their hands in several pies at once)
In general, being less-surveilled and less-restricted at work (and off work) is a perk of higher-status positions in a company. It's a social class thing, essentially. This tendency predates computerized surveillance.
> Prior to the finger program, the only way to get this information was with a who program that showed IDs and terminal line numbers (the server's internal number of the communication line, over which the user's terminal is connected) for logged-in users. Earnest named his program after the idea that people would run their fingers down the who list to find what they were looking for.[1]
> The term "finger" has a definition of "to snitch" or "to identify"
I sometimes wonder about folks in the not-so-distant past of the 1980s and earlier, who seemed to be able to operate just fine with words that had both ordinary and indecent meanings, without feeling the need to call out (or else, defensively avoid) every case in which the former might be mistaken for the latter if someone were trying really hard to make such a mistake.
All I can figure is a rise in lazy accidental-pun/double-entendre humor on various TV shows made us hyper-sensitive to it (think: Beavis and Butthead, for an early example). Call it the "that's what she said" effect.
macOS's is discoverable by the very action one might take trying to find the cursor even without knowing about it, while Windows' isn't. Source: I found the macOS one by accident, but didn't know about the Windows double-tap-ctrl thing until reading your post. Most likely I'll forget the ctrl trick by tomorrow, but shake-the-mouse was stuck in my brain the instant I found it.
Vonnegut wrote some about the effect of recording technology & mass media on the value of individual artistic talent—in short, that it all but obliterated both the monetary and, perhaps more importantly, social value of all but (literally) world-class skill. Fewer sing-alongs around the piano at home. No more small-time performance troupes (say, vaudevillians) making enough money to get by. That uncle who's an amazing story-teller just can't compete with radio programs, so the social value of that skill plummets, and it's like that for every medium that puts the ordinary in competition, if you will, with not just the world's most talented people, but, as fields advance, with entire teams dedicated to making those already-top-notch folks seem better-than-human.
The benefits of this are clear, but the problem is that artistic expression and being able to receive small-scale rewards and genuine encouragement—at least in one's family or social circle—for even minor talent seem to be very healthy and fulfilling things for people to do. Taking that away came at an ongoing cost that none of the beneficiaries of that change had to pay. A kind of social negative externality.
Relatedly, consider the sections of Graeber's Bullshit Jobs where he treats of the sort of work people tend to find fulfilling or are otherwise proud to do, or are very interested in doing (sometimes to the point that supply of eager workers badly exceeds demand and pay is through the floor, as in e.g. most roles related to writing or publishing). What kind of work is it? Mainly plainly pro-social work (to take one of Graeber's examples, the disaster-relief side of what the US military does, which is by no coincidence often heavily featured in their recruiting advertising; or teaching, for another one) or: creative, artistic work.
Graeber notes an apparent trend whereby these jobs tend to pay pretty poorly either due to the aforementioned over-supply of interested workers, or because there's some societal expectation that you ought to just be glad to have a job that's obviously-good and accept the sacrifice of poor pay, and that you must be bad at it or otherwise unsuited if you want to make actual money doing it (teaching's a major case of the latter—I've seen that "if you care about being paid well you must be a bad teacher" POV, and the related "if we raised teacher pay it'd result in worse teachers", advanced on this very site, more than once—it's super-common).
People really, really want plainly-good and/or creative jobs, but those don't pay worth a damn unless you're at the tip-top, either of talent level, or of some organization. This seems like another blow to the creative category of desirable jobs, at least.
My point is: I wonder and worry about the effect this latest wave of AI art (in a broad sense—music and writing, too) generation is going to have on already-endangered basic human needs to feel useful and wanted, and to act creatively and be appreciated for it by those they're close to. There's already a gulf between the among-the-best-in-the-world art we actually enjoy and, should our friends present their creations, how we "enjoy" those these days, with the latter being much closer to how a parent enjoys their child's art, and everyone involved knows it. Used to be, hobby-level artistic talent and effort was useful and valuable to others in one's life. Now, that stuff's just for yourself, and others indulge you, at best.
Why, with this tech, you can't even get by doing very-custom art, such that the customization, rather than the already-devalued-to-almost-nothing skill itself, is what delivers the value. Now the customization is practically free, too, and most anyone can do it.
Getting real last-nail-in-the-coffin vibes from all this. I'm sure it'll enable some cool things, but I can't help but think we're exchanging some novelty and a certain kind of improved productivity, for the loss of the last shreds of a fundamental part of our humanity. I wonder if we'd do this (among other things) if we could charge the various players a fair value for harm to social and psychological well-being that happens as a side-effect of their "disruption"—alas, that pool's a free-for-all to piss in all one likes, in the name of profit (see also: advertising)
I don't think that effect will persist, and before long it'll be about as impressive as using a "meme generator" site. Already heading that way in some settings, as far as I can tell.
> Having experience in PF1 and D&D, I've tried to look into alternative systems, which can be characterized as "rules light" to ease newbies' pickup stress.
D&D 5e and PF 2 both do a pretty good job of:
1) Streamlining and simplifying rules, and
2) Making battles less dull for whoever's not currently taking their turn,
which helps a lot with a couple weak spots in those systems.
I'd still not call them "rules light", but they are a lot more approachable than before.
Your kids will want to see other people in those childhood photos, more than they'll want to see 100,000 photos of their own faces. You, when you were younger; their now-gone grandparents; that friend they had for two grades before one of them moved ("OMG I totally forgot about them!"), that kind of thing. You'll get way more than enough photos of your kids, so far as your kids themselves are concerned, without even trying—it's the others you may neglect.
Now, for your own purposes, yeah, you probably want the photos of your kids more than anyone else who might get in the shots—but try to think of them and snap one or two of other people in the room, when you're taking e.g. birthday photos, or pan around from time to time when shooting videos of them. They'll eventually get more enjoyment out of the stuff, if you do.
Yeah, PF2's still the more complex of the two, for sure, but PF1 was crufty as hell, and they did succeed in making it a smoother experience than that was, at least. I'd say PF2's what you should go with if you want a modern system but like the idea of older-school RPGs, without necessarily loving the reality of them—it gives you some of the crunch of traditional systems with some QOL improvements and some of the dead weight removed, plus some attention to improving the actual experience of play (especially combat, and double-especially combat for non-casters).
Meanwhile, I'd probably tend to point someone interested in D&D but without much knowledge of or opinions about systems, toward 5e—if you don't know you want a somewhat-fiddlier ruleset of something like PF2, then you probably don't want it.
I'd have to move way North to be able to sleep OK without AC. Or maybe to a desert area with high elevation, for those nice, cold nights.
Discovering how much better I sleep when the temp is in the 62-66F range has made a big difference. I'd hate to go back to trying to sleep with temps over 68F, certainly. Over 72F and I basically can't sleep. I mean I'm sure I'd acclimate to some degree, but it's not like I was already used to sleeping in colder temps before I tried it, so I'm pretty sure that range is absolutely better for me, regardless of what I'm accustomed to or able to tolerate.
My dad remembers when they finally got electricity and a (party) phone line—some time in the 1960s.
I'm not sure they got municipal water service until after he was out of the house, probably in the 70s. They used hand-pumped well water, and had an outhouse.
From his stories about his childhood and young adulthood, it seems lots of poor folks and especially the rural poor were still kickin' it like it was the 1920s, well into the second half of the 20th century.
Well, another way I can sleep OK in most any condition (short of becoming badly sleep-deprived) is to get a ton of physical activity during the day. Wear myself right the hell out, so I'm dead-tired by sundown. I bet physical activity levels were way, way higher pre-AC, too.
I've dabbled in fasting and found my ability to do it comfortably varies a ton day-to-day. Some days I'm craving food by noon and/or am very tired and practically can't think at all by 3PM (this is pretty bad on work days, obviously, but not really great on any day). Other days I can go from wake-up to lie-down without a single bite of food, and hardly notice a thing. I can't tell in advance which it's going to be, so rather than sticking hard to a fasting schedule, when I'm in the mood to do it I'll pick days to try, and if at some point it becomes clear I'm gonna wreck my whole day if I persist, I bail and eat a meal. Ends up being about 50/50, days that work out and days that don't, if I try a couple days per week.
Sometimes I can approach 48 hours before I start to feel bad, other times I can't even make it one day before I'm a mess and can hardly function. I'm sure the difference has to do with blood sugar timing and/or what I ate the day before or something.
Maybe, but I'd be careful about generalizing that kind of thing. Even stuff like consistently applying sunscreen is heavily dependent on class membership, i.e. it's "just what you do, obviously, any time you're going to be outside at all, of course we always have sunscreen on-hand and our kids have been taught to apply it as routinely and consistently as tooth-brushing—wait, why are you asking, doesn't everyone do that?" in some circles, while in other (larger) circles they only break it out for beach or lake days, if at all.
I expect the described behavior is similarly unevenly distributed.
It's pretty rare for our nighttime lows to be above 72, even in the Summer, and I live in a fairly hot part of the US (though not the South proper). Even when it's in the high 90s or low 100s during the day, which is typically only a few weeks a year, usually it still gets down to 75-78 at night. We're in the high-80s this week and nighttime lows will be as low as 58F.
Spring nighttime temps are more like 35-55.
The main trouble's the damn humidity. A very-humid 72 still feels gross. And maybe I could sleep like a baby on a night when it gets down to 60, but if that's a spring or summer night I'll wake up nasty and covered in sweat unless I've got some serious dehumidification going (that's what happens when I go camping—might be cold enough to sleep well, but unless it's Winter or late Fall the high humidity still has unpleasant effects). Plus you get mildew and mold problems if you let a modern, very-sealed-up (so, efficient for AC) house get very humid too often, so it's better to just dehumidify with the AC even if the outdoor temp matches what I'd like it to be indoors.
The top-end players operate way outside the norms for mid-market (as in, the "trimodal" software developer comp graph). Among the 8 or so software job offers I've accepted in my career, I don't think I've ever had one take more than 7 calendar days from initial contact to an offer.
I do see some paying mid-market rates and taking a month or more for their process. I bet they believe it's very hard to find software developers.
> they've been working that way for every other medium. Some movies are good, some are full of sex and gore. Parents have the responsibility to keep their kids from watching movies with content they don't want their kids to see.
Movie theaters and video rental places tend to refuse to serve minors for X or R material, and (if they look quite young) even PG-13 films. Stores may and often do refuse to sell R-rated films to minors. It may require active parental assistance for a minor to see those kinds of things (before the Internet, anyway—which is kinda the point of this whole discussion) and, even if enforcement is imperfect, surely serves to limit how much of that material even a very motivated kid can practically see (again, pre Internet, I mean). Also, kids have to get to those kinds of places, which can be pretty damn hard for them to do without a parent at least knowing they're out doing something, if not specifically what.
Broadcast TV stations risk FCC action if they show anything too outrageous when kids may be around, and have further restrictions even in night-time hours.
These behaviors are due to a combination of actual government regulation, and ongoing or historical credible threats of regulation if these industries didn't police themselves well-enough, which prompted the creation and enforcement of things like the MPAA's rating system (plus some now-defunct frameworks like the Comic Books Code or the Hays Code, both of which were much stricter than anything we'd likely accept these days—but for the "what's historically been within the Overton window of the freedom-loving United States?" perspective, those aren't that old and did co-exist with and apply to modern mass media, so still have some relevance).
The closest analog we have that I can think of is cable TV, since it's in the home and offers a whole lot of content, and even that tends to self-censor to a substantial degree and doesn't offer many of the worst things the Internet does at all—even on premium channels, which are another thing an adult has to actively work to bring into their house, totally separable from the rest of what cable offers.
Libraries are less-restricted and librarians seem to enjoy providing things a bit subversive (which is great) but I bet even lots of librarians would ask to talk to a parent before lending out certain books, let alone R-rated films, to young kids.
Support from outside entities—including, and largely, due to government action or threat of same—for parents to control what their kids see and hear is, as far as I can tell, the norm since fairly early in the days of modern mass media.
It's the Internet's model that's an aberration, requiring parents to take active steps to prevent kids from seeing hardcore porn or extreme violence or whatever on the same device they have to use to do homework, rather than having to take active steps to enable seeing those things, as they'd have to in most other contexts. Isn't it? Which doesn't necessarily mean these kinds of measures are a good idea, but I don't think "parents have always had to actively work to keep their kids from being exposed to awful stuff without substantial help from government and the private sector" really holds up, unless I'm missing something.
The Internet's unprecedented in its reach and being something that's basically required in a modern household, and required to allow kids some access to (again, it's not really optional for school anymore), but even other far less necessary media have had effective, if not perfectly iron-clad and universal, age restrictions imposed by businesses and the government. Right?
> that isn't a law, movie ratings are a voluntary system designed to help parents make smarter choices about what they will allow their kids to see.
As I covered, most (all?) "voluntary" mass media industry regulation schemes have been much more of an outcome of "sort your shit out to our satisfaction or we'll regulate you into the Earth's core" grumbling from government, than actually voluntary.
Part of the trouble here is there's just nothing comparable to the Internet. The default before was "parents will have to work to enable their kids to access questionable material", not "parents will have to work (really, really hard) to keep their kids from accessing questionable material, including possibly by accident, and maybe stuff pushed on them by some god-awful 'algorithm' trying to radicalize them or push them into some other harmful rabbit-hole because it helps with 'engagement' or some other dumb-assed metric".
Again, the closest thing I can think of is cable, and that had a much more limited set of content and kept the adult stuff mostly opt-in (so, again, active effort required to enable it), plus cable TV was never 1% as valuable for getting by in modern society as Internet access is—the easy answer of "just don't pay for cable" doesn't apply to the Internet, and hasn't for more than a decade.
However, even with media that are far easier to keep out of the home, government pushes for regulation, and effective imposition of such regulation—de jure or, in fear of what de jure might look like, de facto—has been the norm. Much of this absolutely applied to what adults could access (see, again, the Hays or CCA regulatory regimes). As for "scanning their face every time they access a website"—in earlier situations in which a kid might access some piece of media a parent hadn't deliberately invited into their home, and in which some business was involved, everyone did get a face-scan, by the flesh-and-blood clerk, and since 1990 or so those situations almost certainly also involve being recorded on multiple CCTV cameras (and these days, having all that uploaded to god-knows-where and maybe even having face recognition applied to it—ugh, the Internet was such a bad idea)
Again, part of the trouble with these analogies is there's nothing comparable to the Internet. How do you have a clerk judge whether a person's face looks old enough, at "web scale" and before your web server sends a 200 response? You can't, really, but that doesn't mean something of that sort isn't typical practically everywhere but the Internet. The closest thing you can realistically do is automate that process, instead, to bring it back into something resembling the past norms.
This is, I repeat, not necessarily a defense of this kind of legislation as a good idea–I just don't think it actually is a deviation from what was the overwhelming norm for how society operated in most of the 20th century. The Internet free-for-all model is what's the odd-man-out.
Sure, and that approach may be the right one in this new world. I do reject the idea that policing kids' Internet usage is anything like what parents have always had to do, though—it's a totally new challenge on top of all the ones that already existed.
Before, the usual way to keep your kids from seeing (very much, anyway—no system is fool-proof) porn or brutal violence or from encountering any excessively-weird or dangerous subcultures was basically do nothing—if you didn't try to bring that stuff into contact with your kids, odds were they'd encounter very little of it, and likely none unless they were themselves trying to find it. The Internet's flipped that around, so the default is all kinds of horrible stuff being not just available, but possibly pushed at people in your household, including kids, unless you actively work to avoid it.
The norm in earlier cases when others have some say in whether your kids can access such material, is that they're either prohibited by law from providing it, or else are following some prodded-into-existence-by-government industry guidelines that largely prevent access to it by kids. Parents didn't need to spend much time or attention policing that kind of thing, previously.
> It's getting much much better but performance is only "identical to desktop" if you ignore anything about its resource usage or speed increases in processors over the past decades.
The high power use is what kills me. That and input lag. Fix those and I'd give way fewer shits that an Electron app eats 10x the memory that's remotely justifiable by what it's doing, and more like 20-100x what a well-made desktop program would for the same purpose.
[EDIT] Yeah, I know, high power use and input lag are in part because Webtech disrespects things like system memory, so in practice I'm not going to see one fixed without the other.
Everyone I know with a desktop computer installs & uses tons of programs that aren't Chrome. And most of them aren't programmers or other flavors of computer nerd.
The non-nerds also absolutely notice when some stupid Electron chat app makes their laptop hot and takes their battery life from 16 hours to 2.5 hours (ahem, Discord when using voice and/or video chat). Proponents of that tech claim normal users don't notice, but 1) they definitely, 100% do, and 2) more of them than one might think are even able to figure out which app, specifically, is responsible, not just "my machine's slow and battery's dying fast and I don't know why" (it's usually not exactly rocket science).
There aren't a lot of other contexts in which I can effectively make $500-2000 for being willing to talk to an annoying, but harmless, person for an hour while saying "no" a lot.
> One thing I don't like about Apple's approach to security is locking the user out, making the OS like a black box.
You can still turn off an awful lot of the security features in macOS. Some require a reboot, but still, the option's there for developers and power-users, if they prefer or require riskier operation.
You can turn off most of the stuff that's keeping it out of "your control". I write "most" only to hedge—I'm not aware of any that you can't (though there may be some).
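For concreteness, a rough sketch of the kind of toggles I mean. These are real commands, but exact behavior and availability vary by macOS version (and some, like csrutil, have to be run from the Recovery environment), so treat this as pointers to go look up rather than a recipe:

    # System Integrity Protection: check it, or disable it from Recovery's Terminal
    csrutil status
    csrutil disable    # Recovery only; takes effect after a reboot

    # Gatekeeper: check assessment status; older releases also let you turn it off outright
    spctl --status
    sudo spctl --master-disable    # restores the "Anywhere" option on older macOS versions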
> There is no way for me to put my own configuration in the system and still have it persist. For example I change things in sshd_config (to turn off password auth), and PAM.
Does putting your custom options in something like:
/etc/ssh/sshd_config.d/disable-passwords.conf
no longer allow custom sshd config to survive updates? It's the same idea as configuring daemons the "right way" on, say, Ubuntu, so you don't get a ton of those prompts during apt upgrades asking whether you want to accept the maintainer's config file or roll the dice and keep your own.
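(For anyone else landing here, a minimal sketch of what I mean by a drop-in. The filename and the disable-passwords idea are from the parent comment; the directives are standard sshd_config options, but whether your stock /etc/ssh/sshd_config actually has an "Include /etc/ssh/sshd_config.d/*" line is something to verify on your version, and option names differ a bit across OpenSSH releases.)

    # /etc/ssh/sshd_config.d/disable-passwords.conf
    # Only picked up if the main sshd_config has an "Include /etc/ssh/sshd_config.d/*" line
    # (recent macOS releases appear to ship one, but check yours).
    PasswordAuthentication no
    KbdInteractiveAuthentication no    # older OpenSSH calls this ChallengeResponseAuthentication

Afterwards, something like `sudo sshd -T | grep -i passwordauthentication` should show whether the override actually took effect.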
I've extensively used: apt/dpkg (Debian and Ubuntu), rpm (Mandrake, Red Hat, Fedora), portage (Gentoo), and MacPorts. I also have some experience with package management on Void, Arch, and FreeBSD. I wanna say I used some unofficial package manager on BeOS back in the day, too, and I'm pretty sure QNX had one though I don't remember much about those.
Homebrew is my favorite of them, overall. Though Portage is pretty damn great, for what it is.
Last couple times I got tons of beachballs system-wide for no clear reason, it was because I was doing Android dev and had the emulator running, or, earlier, because I hadn't yet switched to Safari. Both FF and Chrome did that to me, though Chrome was slightly better about it.
> The varying ‘effectiveness’ rates miss the most important point: The vaccines were all 100 per cent effective in the vaccine trials in stopping hospitalisations and death.
That's the only mention of that "100 per cent" figure in the article.
Is that wrong? Were there hospitalizations or deaths among the vaccinated in the vaccine trials?
Agreed the headline sucks, but reporting in general is terrible so that's no surprise.
[EDIT] Oh there's also:
> The numbers Americans should be emphasising are that all three vaccines have proven 100 per cent effective at preventing deaths. The risk of hospitalisation also plummets to virtually zero for people who receive the vaccine.
But the article's so chopped-up and shitty that it's hard to tell what the context of this was. Probably similar to the other quote above.
I've had pain killers prescribed maybe a half-dozen times and the only one for which pot wouldn't have been at least as good was the time I had fairly serious abdominal surgery—and even in that case, I'm not 100% sure weed wouldn't have done the job. Plus, they usually prescribe me way more than I want or need—I may take one or two pills, and they prescribe me six or eight or ten—which inflates sales for these companies while doing no actual good.
For most of those cases (plus some times when I don't/can't get a prescription but still feel really bad—which makes weed a super-hero because I don't have to do the "will I still feel bad enough in 24 hours that it's worth losing 3 miserable hours to driving and waiting rooms and lines, trying to get a prescription?" gamble, I just already have some on hand or can go out and get it in like 30 minutes, guaranteed) I really just need something that can let me sleep at night. Weed does that just fine unless the pain or discomfort is really extreme.
You can install Homebrew other places, too, for that matter.
I'd only tried NixOS (bounced off, couldn't get X-Window working even following tutorials to the letter), not Nix on macOS. "Must be installed in a specific, root directory" is a hard no for me, when it comes to add-on package managers. That's one hell of an odor.
The fundamental problem with any of these p2p-hosting solutions is that most devices folks use these days are battery-powered. Any solution that requires a persistent connection or intermittent network wake-ups to achieve even semi-decent performance won't make it, and if you overcome that hurdle you still won't be able to count on most of the devices downloading content to also serve content. No-one's gonna sacrifice battery life to make the distributed Web dream happen.
They seem a lot more useful for internal infra of hosting providers, though for IPFS in particular I expect the performance isn't consistent enough to be a suitable solution for most of those use cases.
Yeah, if you made a pie-chart of headaches for small business owners, I'm pretty sure like 25-30% of the chart would just be "healthcare", and that'd be the largest single slice. It's kind of a miracle that small businesses do as well as they do in the US, considering how incredibly bad our healthcare system is for them.
I've never seen a wall framed with 2x6s because it had plumbing in it. It's all 2x4 regardless, here—you almost never see a 2x6 wall. Is that a California thing?
ACA plan quality varies a ton based on where you are, which can make it either an OK option, or something you only use as a last resort and try to get off of ASAP.
In my state, there are zero individual plans offered outside the exchange—you call insurers, you call brokers, they'll all tell you the same thing, that every company is completely out of the individual market here unless they're on the exchange—and most years only two providers offer ACA plans, and 100% of those plans have terrible "networks" (which hospitals/doctors/pharmacies/urgent-cares/testing-facilities accept the insurance, for non-US readers).
It's literally impossible to buy an actually-good individual plan in my state. Even middling employer-offered insurance is usually overall-better than the "platinum" ACA plans. I mean, I assume the very-rich have some concierge-type options that are good, but for normal people, having to buy individual insurance here means definitely having bad insurance, even if you pay a ton.
Red state in the Midwest that uses the federal exchange. I understand things are much better for ACA plans—both prices and plan quality—in states that have taken healthcare and the ACA seriously (so, mostly blue states) but that's only... like, half of all states, at best. Granted, that's our own stupid state's fault, but still.
> The US has a very progressive income taxation system, far more so than most of Europe (including all of Scandinavia). The US middle class and below pay exceptionally low income taxes, the burden is overwhelmingly carried by the higher income brackets already.
We have progressive wage income taxes (much less so if you include highly-regressive FICA contributions at ~7.5% for W2 and ~15% for 1099 employees—but still) but overall the US income tax scheme is quite regressive, thanks to how capital gains taxes work, which leads to things like Warren Buffett observing that he enjoys a lower tax rate than his secretary does.
> My mother needs a phone and left to her own devices her friends of the same age would probably convince her to buy a cheap Android (which would end up frustrating her eventually), or she just wouldn’t even buy a phone at all.
My parents do that. Have plenty of money but just can't bring themselves to spend more than ~$300 on a phone. And, to be fair, if they spent double or even triple that but still bought Android they're correct that it wouldn't be an improvement, since the UI's the main problem.
> So I buy her phones.
Having watched my parents' experience over the years, I assure you you're doing the right thing.
One of the more memorable ah-ha moments for me in college was embarrassingly simple: my IPE professor leaned against the desk while the class was discussing something, and quietly asked, "why have an economy?"