I agree with this. Good software updates that I've resisted in the past have often surprised me at how much I prefer them once I acclimate to them.
The alternate line of thinking, that nothing big about the UI or system can ever change unless a customer specifically asks for it, seems to conveniently pop up in teams whose enormous technical debt makes big changes unthinkable for managers. I'm guessing this is the case for QuickBooks - given the years they've spent supporting an inconsistent system, apparently with a goal of preventing it from becoming consistent, while also inserting customer-requested features.
I have a hunch that if you're afraid of inconveniencing customers by making the software better, then the real problem is that you're afraid users won't stick around, and it's because you think that you haven't taken good enough care of them to weather the bump of re-acclimation. Compare the unchanging-system strategy to Snapchat's. Snapchat regularly improves their software (and users complain about it until they re-acclimate), because they are confident that users will stick around through updates; this confidence comes from the fact that they know they do a good job serving their users. So, my hunch is that when this unchanging-system strategy emerges, it's likely because you've already spent a long time disappointing users and eroding customer confidence.
> You know what might change this? I gave a talk on this to Stanford alumni and afterward a lawyer came up to me and said there are going to be lawsuits.
This won't happen as long as the worst offenders can push individual arbitration agreements. The Supreme Court heard arguments about them recently, and a decision on whether they are legal is expected soon.
On the other hand, electronic health records should make it easy to show with statistics whether a workplace causes chronic disease, relative to its peers. We just need to add employer information to the standard data that hospitals collect. Certainly a group of university hospitals would be interested in seeing this data, and could organize an initiative to collect it.
Edit: Actually, a way to link massive amounts of healthcare data to patients' employers might already be available. Medical records often store insurance information, which can be used to figure out a patient's employer if they are the subscriber.
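To make the linkage concrete, here's a toy sketch (all field names, values, and helper names here are hypothetical, invented for illustration) of inferring an employer from an insurance group plan, and only when the patient is the policy's subscriber rather than a dependent:

```python
# Hypothetical lookup from insurance group numbers to employers.
group_plan_employers = {"GRP-1001": "Acme Manufacturing"}

# Toy patient records with insurance fields (field names invented).
records = [
    {"patient_id": 1, "group_number": "GRP-1001", "relationship": "self"},
    {"patient_id": 2, "group_number": "GRP-1001", "relationship": "spouse"},
]

def infer_employer(record):
    # Only a subscriber's own group plan reliably implies their
    # employer; a dependent may work somewhere else entirely.
    if record["relationship"] != "self":
        return None
    return group_plan_employers.get(record["group_number"])
```

The point is just that the join key (a group plan number) is probably already sitting in the records, so no new data collection would be needed for subscribers.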
I had hoped "software engineer" would have meant more of a focus on making systems reliable, vs "software developer," but in my experience it's only a trendy job title preference.
> I was fooled into joining because I met with a data scientist and an "experienced web developer" without actually looking at their code.
Is it typical to be able to look at a company's code before signing up? After having a similar experience, I wish I could make that a standard part of my interviewing process. (Brief backstory: I would never in 1000 years have taken my first development job if I had seen the code first, but HR and the carefully selected engineers who could tell positive stories gave me a false impression of what was going on.)
If 1) there's a solid engineering case to be made that this code is bad, 2) the code does something that's important, and 3) the company doesn't want to fix it, then you can be confident that the company also isn't interested in investing in the things that you as an employee would want them to be. I would expect work in general to be unenriching, raises to be smaller than at other places, and other benefits to be not as good. This could be an early sign that spending time with this employer is bad for your professional growth.
Edit: definitely kick the tires a bit by seeing if this situation is typical for the company, or if it's an outlier.
I've wondered about this question, and I'm not sure it correlates that way.
One company I worked for was known for being one of the lower-cost vendors in its industry, and I experienced a pervasive back-stabbing culture, which I attributed to there being many employees who were especially fearful of losing the only job in the industry they had been able to get. So I think in a lot of cases, the lower-paying companies might have employees who are more desperate, and management that is used to having more-than-typical negotiating leverage, which it does exercise.
Alternatively, companies who want to exploit workers can offer lower-than-expected wages, to make sure they end up only with employees who can be talked into going along with bad situations.
Crohn's is another inflammatory disease that has long been suspected by some researchers to be infectious. The company RedHill Biopharma just showed that a combination of generic antibiotics could be as effective as Humira, the highest-grossing drug in the world, at treating the condition.
I keep seeing news about severe impacts of chronic infections that we aren't good at testing for, and aren't good at treating. Examples include gut microbes that cause cancer, the only-recent recognition that HPV's cancer risk to men is significant, and chronic Lyme disease.
This article cites the historic example of peptic ulcers, which the medical community didn't believe were caused by an infection until Barry Marshall infected himself with H. pylori in 1984, developed the disease, and treated himself for it with antibiotics. Marshall and his research partner won a Nobel Prize for this work.
Mutopiaproject and LilyPond are great. I used Mutopiaproject a lot when I was a teenager who enjoyed playing piano. It and imslp changed how I explored music, and helped me develop my interests. I always preferred the LilyPond scores from Mutopiaproject when they were available, because they were usually higher quality. I actually decided to give back to the community by typesetting some scores myself.
The thing that led me to stop contributing was ultimately seeing the progress that was being made toward automated music transcription. I saw this technology as something that would make the process a whole lot faster once it was good enough, and so it wasn't a wise time investment to keep writing out those intricate files by hand, but it also wasn't yet the right time for me to start using that transcription software, either.
At the time, the software that caught my attention was mainly the kind that helps users turn a scan of sheet music into something that could be transformed (with a text editor and some scripts) into a LilyPond source file with far less work than actually typing out all of the LilyPond input yourself.
I can envision software that does this, but it seems like a big project with very uncertain adoption outcomes.
Edit: I think software like this probably already exists, essentially, and definitely if you consider running one of the LilyPond command line conversion tools an acceptable step. The system's main differentiators would be being web-based (which isn't that uncommon anymore for music engraving software), and targeting LilyPond as the best-supported output.
This highlights the tension between wanting to navigate the instrument easily vs. wanting to understand the music easily in terms of scales and tonal theory. Standard music notation is great for understanding the notes in the context of the scales and chords you're using, while guitar tablature is excellent at telling a guitar player how to play the notes.
As a pianist, I'm certainly not looking for this, but a piano keyboard is designed exactly the same way that standard music notation is; the notes for a C major scale are the default, and then there are the other notes that you can access differently. I would suspect that vocalists would prefer standard notation also, since their ear naturally understands things in terms of the scales that they are used to.
From talking to advanced classical guitar players, I get the sense that standard musical notation is generally preferred over tablature. They already know where the notes are, so they don't need notation that tells them. Their chief aim is to play the music in a way that shows understanding of the material and treats it well, so for them the standard music notation that provides easy musical understanding is an advantage.
I've programmed some things that worked with music, and it's annoying to have to convert between standard notation and integers that represent unique notes. This proposed notation system might make an ideal specialized notation for programming music applications.
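As a sketch of the conversion chore I mean (the helper names are my own, following the common MIDI convention that C4 = 60):

```python
# Semitone offsets of the natural notes within an octave.
NOTE_OFFSETS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def note_to_int(name):
    """Convert a name like 'C#4' to a MIDI-style integer (C4 = 60)."""
    letter, rest = name[0], name[1:]
    accidental = 0
    while rest and rest[0] in "#b":
        accidental += 1 if rest[0] == "#" else -1
        rest = rest[1:]
    octave = int(rest)
    return 12 * (octave + 1) + NOTE_OFFSETS[letter] + accidental

def int_to_note(number, prefer_sharps=True):
    """Convert a MIDI-style integer back to a name; the spelling is
    ambiguous (C# vs. Db), which is part of what makes this annoying."""
    names = (["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
             if prefer_sharps else
             ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"])
    octave = number // 12 - 1
    return names[number % 12] + str(octave)
```

The round trip isn't lossless (`Bb3` and `A#3` map to the same integer), which is exactly the kind of detail a notation built around integers would sidestep.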
In the dialogues that I've read, the things Socrates tries to figure out are big questions that possibly don't have correct answers at all, so a disappointing outcome is seemingly inevitable. Also, the dialogues are actual attempts to solve these big questions, not exercises in teaching something the teacher already knows the answer to.
If you're using the Socratic method to help someone discover how they can apply an if-statement to execute a section of code conditionally, then you're working toward a different purpose, and solving something much easier than Socrates was.
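For instance, the entire destination of such a lesson might be a few lines like these (a minimal sketch of my own, not anything from the article):

```python
def describe(temperature):
    # The if-statement executes its indented block only when the
    # condition evaluates to true; otherwise the block is skipped.
    if temperature > 100:
        return "fever"
    return "normal"
```

The learner can verify the answer immediately by running it, which is a luxury Socrates's interlocutors never had.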
Millennials are forced to be more careful, because they're climbing a difficult and important ladder into financial security. Most people I know who aren't looking to marry are just very aware that this is not the right time for them, neither financially nor in relation to achieving their professional goals.
This comment overlooks the fact that the U.S. court system is currently disagreeing with itself over whether forced individual arbitration is legal. The last time SCOTUS heard a case about this, in 2016 I believe, it split sharply, by a single vote. This is yet another arbitration case where the courts that saw it disagreed with each other.
I just looked this up, and wow. I see no reason this situation could only have happened back then.
In the modern day, a lot of people with a lot of education will scoff at anything that hasn't been produced by science, or even anything that simply wasn't taught to them in medical school. Meanwhile, the medical system is full of inefficiencies and outdated practices because of politics and resistance to accepting or learning newer scientific findings. Worthwhile questions about simple ways to improve things often go unanswered, because nobody has been able or willing to do a large, randomized trial about it.
I think that if it's possible to define a mode of operating a business that doesn't harvest data beyond what's essential to the service, then there should be a law requiring companies to offer this option: you pay directly, out of pocket, the revenue the company would have expected to make from your data, and in exchange the company skips the collection. It seems difficult to get to such a definition, but I think this law would be very popular.
Looking back I could have talked about Zika here. How many childhood viruses are we sure don't cause down-the-road issues like cancer? Most of the time when you get a random virus, the doctor has no clue which one it is and doesn't test to find out. I can remember being told by nurses at the school "there's a virus going around," and that's all we knew: there was some virus, and everyone was getting it. We just assume that these things are harmless and irrelevant once you recover from them.
There is huge room for medicine to be improved, but the reason for that is a lot of resistance to changing things in major ways, despite all the things that could be done. A lot of software companies know that their key to staying on top is to improve things fast enough that a group of founders in a co-working space won't put them out of business. Hospitals don't have this pressure at all; they just have to keep providing the acceptable services they always have, and people will still go there. I've been pressured into unnecessary procedures and even given (multiple!) sales pitches for surgeries I didn't need or want, because the hospital makes huge amounts of money from them, and consumers aren't savvy enough to know when a doctor is just trying to pad his profitability figures.
Plus, when you write software that disrupts an industry, your employees are still software developers who work and function like software developers. If you want to change healthcare, you need to change how the doctors operate.
Oh, I forgot an important detail. I should have added that another aim would be that, as a result of paying this money, you wouldn't receive any advertisements from the service.
This question for me is more about the ethics of trimming plants than about bonsai specifically. Was the original question implying a more general metaphor (not specific to plants) like the one you went into? I think trimming a plant with respect for the plant's health is in line with what plants are adapted to experience, and won't cause suffering or loss of life quality.
> There are whole teams in health care and applications from suites like Epic ... just for refining billing.
To anyone interested in AI in healthcare, I suspect that datasets of procedure and diagnosis billing codes could be some of the most accurate and immediately usable of their size.
This article calls Apple’s commitment to security and privacy a “superficial jihad” simply on the basis that regulating data collection might weaken Apple’s competitors.
This is a bad line of reasoning. An organization can care about X intrinsically despite the fact that X has other benefits. The claimed other benefits also aren’t useful in this case; if Apple’s competitors see reduced value in data collection, then it’s even more important for them to excel in the ways that are closer to Apple’s strategy, increasing Apple’s competition.
Besides, I’ve seen Apple dedicate loads of engineering effort to prioritizing security and privacy over the years, while also passing over easy money to be had from their users’ data. It appears to be a sincere and long-established commitment.
SEEKING WORK, Web Developer, Remote, based in the U.S.
I'm a software developer with an eye for good system design. Two of my passions are expressing powerful solutions in simple ways, and working with modern technologies that facilitate clean approaches to software development.
I focus on web development, though I have experience in various technologies.
---
- Full-stack MERN developer with years of software development experience, who is newish to freelancing.
- Significant experience also in Python, Java, C, and relational databases.
- Good at building new products and maintaining existing ones.
- Available to work in teams or individually.
- Competitive rates.
---
For my email address and LinkedIn, please see my profile.
For me as a vegetarian, getting into Indian food was a dietary breakthrough. Vegetarian Indian food has already been developed for thousands of years to be tasty and meet dietary requirements.
I grew up on typical American cuisine and then became vegetarian, and I was disappointed by a lot of vegetarian recipes that were designed to make sense to people with my culinary background. However, I recognize that switching someone's cuisine is a big thing to ask for; reducing meat consumption will go a lot more smoothly if it doesn't mean switching to a new culinary tradition - which is why cookbooks featuring those recipes that I didn't like are so popular.
When I tried on an Apple Watch at an Apple Store, I noticed that the heart rate it displayed was definitely different from mine.
I mentioned it to the employee who was manning that table, and she told me the demo watches had some pre-programmed data for demo purposes.
I was puzzled by this since heart rate monitoring is a key feature users care about. It occurred to me that heart rate data of everyone who has tried on the watch might count as the kind of medical information you shouldn’t publish on demo watches.
Imagine ridiculous stories like “CEO of X’s heart rate was 130bpm when he tried on the new Apple Watch, says journalist who tried it on right after.”
The Apple Watch is in a different regulatory space than the devices listed in that source, since it's cleared as a class II medical device by the FDA. HIPAA laws will apply in some use situations, so their lawyers will have thought about it and made recommendations. (Lawyers aside, the feelings of both the medical community and Apple are generally that you shouldn't leak customer information. This seems like the kind of data that people would accidentally leave behind at the store without thinking about it, which doesn't feel right.)
That makes sense. Employers save themselves a lot of employee dissatisfaction through honesty and transparency during the hiring process. If they misrepresent themselves, they end up with employees who aren't a good match as a result of decisions made on bad information, and who feel cheated to boot.
Edit: Some companies misrepresent their cultures because they know their actual behavior is unattractive to workers. Bolstering Glassdoor reviews by encouraging unusually satisfied (or otherwise motivated) employees is just another form of this. The strategy is successful at tricking employees into accepting poor working conditions while also demonstrating that the employer isn't worth trusting.
I usually see this idea of a heroic great artist (compared to artists who presumably don’t have what it takes to merit being mentioned) in discussions around classical music. When people talk about visual art, there is more focus on the quality of the work, and the artist’s distinct voice. Consider an example: a work by Picasso looks like Picasso made it, and that’s what gives it its value. By contrast, derivative works that merely copy another artist’s style are usually not held in high esteem. Chopin’s work scores very well on quality, and his distinct voice usually makes it obvious that the piece you’re hearing is a Chopin. The visual arts are a lot better at allowing more artists to become established than the classical music world is for composers.
This may be a consequence of the properties of the creations. People who want to listen to a concerto will share the experience with many others, and typically won’t be able to buy it all just for themselves. People who want to be involved in the contemporary art world can and often do buy physical works of art for themselves, and it doesn’t matter whether or not the artist could convince a theater of people to pay to sit down together and appreciate it for 30 minutes. If art patrons want to buy physical pieces for themselves, you need a bigger pool of artists to serve them; but if you want to fill opera houses, it’s better to have a few artists that everybody has agreed to like.
> There's a lot of great stuff in classical music that's by unheard-of, "minor" composers, and even some pieces the composer of which is entirely unknown, and we can only make guesses as to their rightful "attribution".
It was careless of me to say that works by Chopin sound like they are obviously by him. Recently, a friend played a piece by him that I was unfamiliar with, without saying who wrote it, and I told him it sounded like an unusual imitation of Chopin. I'd still say that his distinct voice was present in that piece, and that it's there in just about everything he published.
The book takes music theory as its subject, but explores perception, culture, the meaning of art, innovation, and other related ideas in its lengthy footnotes and tangents. It shows a world of practical considerations that go into art, which many would assume is primarily subjective.
Schoenberg is known as a composer for exploring atonality, and for having many students who became important composers.
From an accessibility standpoint, a design needs to work well in black and white / without color. Color shouldn't be functional in the sense that people need it to use a design easily and safely.
These might be good arguments for letting people fill the ballot in manually, if they wish. Based on the design as I understand it, it seems like users aren't prohibited from printing a blank ballot and taking a pen to it themselves.
> classical music isn't that much fun, compared to so many other options today
I think it's simpler than this; classical music is still pretty fun if you're into it.
People listen to music that they culturally relate to, and the experience of music is a way to express identity. There was a time when the classics of classical music were being written and were a current thing. Listening to Chopin or Liszt was a way to express pride for your country, as well as to do what your friends were doing. Now that so much time has passed, classical music has simply been crowded out by other music that expresses identity and culture. For most people, classical music is no longer a thing their friends are doing, and it wasn't for their parents, either.
My alternate theory about how classical music gained sinister connotations is that it's a cultural marker that expresses a cultural identity, but one that the audience members and their friends don't relate to. It's a convenient way to present a character as an outsider, and, disregarding the societal implications of this, it's a convenient tactic for a writer who wants to paint a character as sinister or untrustworthy. If you consider the alternatives, you'll see why classical music is an easy go-to in this situation; if you give the villain Coldplay as their theme music, it's going to risk making them feel more relatable.
Trying things based on existing understandings leads to discovery, but so does trying things "just because." It's worth pointing out that the former often involves an existing understanding that isn't actually correct.
Both activities are useful, and they're both rational because they're useful. (Obviously you can't try out every possible thing "just because," so formalized science has to prioritize.)
SEEKING WORK | Web Developer | Remote or Miami, FL
Most of what I do these days involves proposing, implementing, and deploying solutions that combine services and technologies on the front end or with a Node.js server.
The languages and frameworks I'm most familiar with are JavaScript, Node.js, React, Python, and Java.
I have five years of experience as a professional software developer.
Please see my Hacker News profile for contact information.
I might be in the bargaining stage of grief - but I have to think that the architects and engineers who worked on Notre-Dame in the 12th and 13th centuries anticipated that it could be gutted by fire, and designed it so that it could be rebuilt.
My undergrad's main building - definitely nothing comparable to this cathedral, but from a time when firefighting wasn't that great - went through this three times, and was always restored. It seems like this happened a lot, and was something builders considered.