Game artists are going to be one of the largest consumers of AI image gen tech in the whole industry because of how much tedious work is required to get from a sketch to a 3D model. So I doubt any serious artists in that field would side with you on that statement, and many are probably training their own internal models on their own work anyway.
Not an issue if they train an AI model of their own accord. The issue is the prevalent theft among AI companies that simply steal people's work to train proprietary models.
I agree it's wrong if it's proprietary. But I think open models trained on scraped data, regardless of where it came from, are valuable enough to offset any upset about the data.
If this were all locked behind Adobe, they could charge $10k a month for access (if you think that sounds ridiculous, look at AutoCAD licensing). Now it's free. For the first time in decades, artists have a chance to own and control the next generation of tools and make them entirely free.
Sad to see short-sighted ones bickering because they feel slighted that they were part of the data. Real small-picture thinking that will screw them over in the long run.
It's not, and plenty of pieces of art only exist in good quality today because of what you think is theft. You're wrong, and things are not black and white.
Yeah, even the ones artists do get to share; there's just so much out there now.
If you take a quick look at sites like ArtStation, you'll see thousands upon thousands of new artworks added daily.
This is something I've been thinking about lately with regards to the early arcade game developers in Japan, not just in regards to the artwork and music but also the decades of technical knowledge and R&D that have gone into creating bespoke arcade boards. Individual game companies developed or commissioned unique hardware dedicated to playing sometimes a single game in an arcade setting. The MAME devs have done a lot of great work unraveling these boards. Some books and websites have compiled artwork, interviews and technical info from certain arcade developers. I just wish the world could see more from that period of time.
I think about this same kind of thing with regards to the company I work for. We create video hardware, and so much of the development knowledge will be lost to the sands of time. This is an inevitable but sad part of any large creative undertaking.
Games weren't getting individual custom CPUs or anything that exotic, but from roughly 1980-2000 the industry was still riding the peak of Moore's Law, where chips were getting faster and cheaper seemingly every day, and arcade developers were constantly trying to leapfrog each other. Each game system really only needed to run a single game, so there was no real barrier to having unique boards per game -- if a particular game needed a few extra RAM chips, or the CPU maybe needed a higher clock, there wasn't much barrier to that.
NOTE: System16 is a treasure trove of information in general, but it looks like their main landing page at https://www.system16.com has some kind of hijack ad situation going on? The links below seem unaffected.
Konami was kind of the poster child for bespoke hardware; they're famous/infamous for seemingly every single game having some kind of weirdo bespoke hardware iteration: https://www.system16.com/museum.php?id=5
Sega had a more typical approach. They would typically have a minimum of several games per arcade system, and sometimes dozens: https://www.system16.com/museum.php?id=1
But, even then, there are lots of variations to the rule. Taito operated more or less like Sega, but (to choose one at random) you might see things like this where one game on this particular board (Cadash) was equipped with a faster CPU than the others: https://www.system16.com/hardware.php?id=652
There are some fun exceptions to what I've written though: truly one-off hardware.
It runs through a variety of Sega Model 2 arcade system games. Theoretically there were about a dozen "Model 2" games.
Buuuuut, as the video details, the "Model 2" moniker refers to an entire generation of Sega arcade boards, and there were significant hardware upgrades and new chips added throughout the lifetime of Model 2, with the result actually being something close to bespoke per-game hardware.
Also, there's this hilarious (to me at least, from an engineering perspective) look at the extremely unique Popeye arcade hardware, with three different graphics chips each drawing different layers on the screen. That was some bespoke-ass hardware.
>and so much of the development knowledge will be lost to the sands of time
This is when you put on your Assange mask and start saving documents to leak later. Capitalism is going to eat itself, along with all of our technical and institutional knowledge, unless people put that continuity of expertise before the company or even their own jobs. One of the many Boomer habits that we simply cannot carry on into the future is, "Retiring without passing on what we know because it's less trouble and more job security."
It got figured out, and people will figure it out once more if needed. I wouldn't worry too much about it. A lot of it has never been seen as knowledge to be preserved and passed on, but as something that somebody did based on a deeper understanding of the subject.
A lot of companies deem the knowledge obsolete. There's no real reason to read 35 year old, 500-page hardware documentation books, or ancient assembly code from 25 year old tapes and floppies... if they even still exist! Much cheaper and easier to re-create the game.
Even the content that made it into the game will never be seen by most users, as most buyers of a video game will never complete your game. But if the ending is not satisfying for those who complete it, they will impact the buying decisions of those who never play through it anyway.
And on top of that:
Before the introduction of Trophies in console games there was no way for game developers to know how far their buyers actually played the game before they stopped and moved on.
On Steam you can see the % of players that got each achievement for a given game.
I looked at Dark Souls, and only 31.5% got to Anor Londo (on PC). It's crazy because that's barely half the game; there is still a lot of high-quality content left to be experienced, including some of the stuff that made it so famous.
Does that include people who never even played (or installed) the game? Because I have lots of games I never intended to play but that were part of a bundle, and I wanted to support the developer.
There are some games that award an achievement basically for starting the story or finishing an early quest, so if those complimentary achievements aren't at 100%, you could use them to make that measurement.
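For what it's worth, those global percentages are public, so you can do that correction yourself. A minimal sketch, assuming Python with the requests library; the app id and the two achievement API names below are placeholders I made up, not the real ones for any particular game:

    # Pull Steam's public global achievement percentages and use an early
    # "freebie" achievement as a baseline to exclude owners who never started.
    import requests

    APP_ID = 211420  # placeholder app id; substitute the game you care about
    URL = ("https://api.steampowered.com/ISteamUserStats/"
           "GetGlobalAchievementPercentagesForApp/v2/")

    resp = requests.get(URL, params={"gameid": APP_ID, "format": "json"})
    resp.raise_for_status()
    stats = {a["name"]: float(a["percent"])
             for a in resp.json()["achievementpercentages"]["achievements"]}

    started = stats.get("EARLY_STORY_ACHIEVEMENT", 100.0)  # hypothetical name
    milestone = stats.get("REACHED_MIDPOINT", 0.0)         # hypothetical name

    print(f"Started the game:  {started:.1f}% of owners")
    print(f"Hit the milestone: {milestone:.1f}% of owners, "
          f"i.e. {100 * milestone / started:.1f}% of those who started")

Dividing by the "freebie" rate is the correction: it restates the milestone percentage relative to owners who actually launched the game rather than everyone who bought it.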
I'm surprised the other way. I would not expect 31% of players to make it halfway through Dark Souls. I probably got about 1% through it. Regardless of the "quality", I didn't find it fun.
I still have yet to give Dark Souls a try; I was always hesitant to try Souls-type games. But I wanna shout out Elden Ring for breaking that cycle for me. Truly amazing game.
If I'd never played Dark Souls, I almost certainly would have played Elden Ring. As it is, I'm sticking to my lifetime boycott of the franchise though.
Yeah, to specifically choose as an example a franchise whose entire "thing" is its murderous difficulty? I'm incredibly unsurprised that most people who try can't be bothered to grind all the way to the end.
My reasoning is that it's a massively popular game that was played again and again by all sorts of people, who became renowned for their collective efforts to explore every nook and cranny (including cut content still in the files!).
But despite these unusual efforts, the completion rate is low, like many other big-profile games.
Steam's statistics are very questionable. They routinely "lose" hours of playtime from games in my library and syncing savegames just doesn't work half the time.
Even if that doesn't include the players who never even installed the game, there's still a fat tail of players who bought it on a sale, tried it for an hour, and weren't interested enough to continue because there are just so many other options. Things used to be very different back in the era of cardboard boxes, when most people didn't have an abundance of choice and impulse buying games was much less of a thing.
Same on consoles. The ratio of players who completed the story of "The Last of Us" on PlayStation (a game praised for its outstanding storytelling) is a mere ~28%.
What's even more interesting: almost the same stat applies for the PS3 (27.2%) and the PS4 remastered (28%) releases.
--> Something caused 72% of all players to NOT complete this game, on BOTH generations of the console. That's a metric you would file as opinionated if your playtesters told you that...
My theory is people are buying games on sale.
$15 is loosely equivalent to a two-hour movie.
So as long as they put in a couple of hours, they feel like they've got their money's worth.
A game is just a distraction for most people, especially as the economy is cratering and other priorities come up.
Yeah, I don't know that the rate of people who "finished" a game is actually very significant.
There are just so many other factors that determine this, unrelated to the game's actual merit.
Cave Story is one of my favorite games ever and I never quite beat that final level. I remember trying it about 10-15 times and then just getting busy with something else in life and I never went back to it.
And I'm okay with that! It was still a wonderful experience! The fact that I only made it 99% of the way through really says nothing about the game itself.
I'm one of the 68.5%. I bought it on a sale, played it for like 30 mins, decided I really just hated that type of gameplay, and never touched it again.
The bigger concern for me is getting to the end of games I actually enjoy. What normally happens is something comes up in life that stops me from playing for a few weeks, and then when I try to pick it up again, I've forgotten how to play!
Agree. The back part of the game is not as well regarded these days by the fanbase.
For my playthrough, I thought certain sections were more annoying than before, but not to a noticeable extent. Besides, an okay set of levels in a FromSoft game is still way ahead of most games.
How much is this just a result of the modern games market being so heavy on sales and things like Humble Bundle?
When I was a kid I put in my best effort to play through games (well, at least before piracy) because they were relatively expensive and had to be "bought" through begging and good grades. Being physical copies, there was also a cost to keeping them.
Nowadays I am more likely to just buy a few year old game that I was interested in for like $5 and leave it in my digital library for if I ever have the time to play through it. Only to usually just go back to Factorio.
> most buyers of a video game will never complete your game
Even content that made it into a movie won't be seen by people who didn't watch the movie to the end. And it won't be seen by people who didn't watch the movie at all.
> Even the content that made it into the game will never be seen by most users, as most buyers of a video game will never complete your game. But if the ending is not satisfying for those who complete it, they will impact the buying decisions of those who never play through it anyway.
That's because a lot of gamers will get it in a bundle, at a discount, or as part of different passes. Those aren't the core audience that would even be expected to finish the game.
Similarly, I'd bet that "most people" these days don't finish the Lord of the Rings books. Luckily we didn't have product managers telling Tolkien not to finish writing the books while looking at statistics saying that "most people don't finish books".
There are plenty of games I have paid good money for that I didn't finish for one reason or another. Sometimes the game just falls short of my expectations (free demos went out of fashion about a decade ago), sometimes it offers me a compelling experience for a while, but not through the whole experience. And sometimes life gets in the way. I'm not rich by any measure, but I have enough disposable income that I don't have to force my way through an experience I stopped enjoying.
Similarly, there are lots of books I started reading and never finished, and plenty of TV shows I started and dropped at some point. Even a few movies I started and never finished. And that's ok. But unless they are highly recommended I don't start TV shows that were canceled, and similarly a bad ending is a mark against me ever purchasing a game or book. There is a good chance the show or game will lose me before I ever get there, but why would I set myself up for a bad experience in case I do like the product enough to get that far?
Quite often there will be games I love, but life gets in the way. Then when you try to go back, especially at higher levels, it's very difficult to get back into the pacing/difficulty, so you just don't pick it back up. Even though it's a great game and you enjoy it.
> Even the content that made it into the game will never be seen by most users, as most buyers of a video game will never complete your game. But if the ending is not satisfying for those who complete it, they will impact the buying decisions of those who never play through it anyway.
How is this different from any book, movie, or other form of entertainment?
Case in point: I have not yet watched Game of Thrones, but much of my momentum to watch it was stymied by fan outcry of just how atrocious the final season was. If I do watch it, I will probably stop before I get there.
> How is this different from any book, movie, or other form of entertainment?
I'd say if you spend millions to make a movie to be shown in the cinema you don't expect that by default 50% will walk out before the finale. But even more crucial: If that would happen, your ticket-sales would plummet week-by-week.
> Case in point: I have not yet watched Game of Thrones, but much of my momentum to watch it was stymied by fan outcry of just how atrocious the final season was. If I do watch it, I will probably stop before I get there.
The final season of Game of Thrones was based on the success of the seasons before it; it was produced AFTER Seasons 1, 2, 3, 4, ...
> I'd say if you spend millions to make a movie to be shown in the cinema you don't expect that by default 50% will walk out before the finale. But even more crucial: If that would happen, your ticket-sales would plummet week-by-week.
Maybe walking out in the middle of the movie is a bad analogy, but, it would certainly be applicable to the streaming world, where half your audience might watch Episode 1 of New Netflix series but stop watching well before Episode 10.
Valid in some way; however, there is an additional half of users who never launch the game at all. Several high-profile games on Steam have global achievement stats where the achievement you get for launching the game / completing the tutorial / creating a character is only earned by at most 60% of the people who own the game.
I've purchased some of my favorite games multiple times — both to support the developer, and because I prefer the newer platform and don't want to have to deal with playing the game on the old platform.
But often, after the secondary purchase, I won't actually play the game again. I didn't feel like playing it again in that moment when I purchased it again. I know I will want to replay it one day, which is why I re-bought it. But until I actually do, that second copy will seem to have been purchased by someone who has never played the game at all.
The vast majority of games aren't bought twice or re-released. What happens is that people buy a bunch of games at the same time and then don't play the third or fourth game.
That's true. But you can look at specific trophies to narrow down the users who actually engage but then still "walk out of your story" (a ratio which of course varies greatly from game to game).
Spider-Man, one of the high-profile games on PlayStation, has a trophy for each of the three acts of its story.
While 67% of the players finished Act#1, only 49% actually finished Act#3.
So setting completion of Act#1 as a minimum to measure engagement, 27% of those who "sat down" still walked out before the end.
And that is not a sign of failure but a landmark success of that industry. That's quite unique for a product of creativity.
And even more odd: it doesn't even mean that those 27% disliked your product. They might even buy your next creation...
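To spell out the arithmetic behind that 27% (my own back-of-the-envelope restatement of the numbers cited above, not additional trophy data):

    # Funnel math on the cited trophy percentages (all % of game owners).
    finished_act1 = 67.0  # cited above: finished Act 1
    finished_act3 = 49.0  # cited above: finished Act 3 (the ending)

    # Treat finishing Act 1 as "actually sat down and engaged".
    finished_among_engaged = finished_act3 / finished_act1   # ~0.73
    walked_out = 1.0 - finished_among_engaged                # ~0.27

    print(f"Finished, among engaged players: {finished_among_engaged:.0%}")
    print(f"Walked out before the end:       {walked_out:.0%}")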
> Maybe walking out in the middle of the movie is a bad analogy, but, it would certainly be applicable to the streaming world, where half your audience might watch Episode 1 of New Netflix series but stop watching well before Episode 10.
What would be "applicable"?
The streaming world of a TV series would be comparable to a streaming world of games. Here the gaming industry is largely selling products of entertainment to customers who pay explicitly FOR THAT product, yet a significant portion of that audience actually never fully consumes it.
That's how it is different. CREATIVELY different. A team of possibly hundreds of people invest years to create a creative product which only a fraction of their paying audience will ever experience in full.
It's like a non-Spotify world where every artist produces an album to be sold on CD, knowing that 50% of the paying audience will never hear more than tracks 1-4.
> Case in point: I have not yet watched Game of Thrones, but much of my momentum to watch it was stymied by fan outcry of just how atrocious the final season was. If I do watch it, I will probably stop before I get there.
With AI, we might one day be able to remake the last (two) season(s).
Everything leading up to the end was magnificent and breathtakingly fresh. No reason it shouldn't be salvaged.
I bet a lot of movies and shows will get "fixed" by fans in this fashion. We've already seen dozens of Star Wars OT and PT edits, several Lord of the Rings edits, and my take is that this trend will grow to encompass everything as it gets easier to do.
Season 6 was alright, with the ending making up for a largely mediocre season. Season 7 was about as bad as 8, but it's harder to tell because the bad moments weren't character/plot altering payoffs, just teleporting around the map and ignoring depth in favor of spectacle.
I just felt like 6 is where the writing became very amateurish. Iirc, 5 was where source material ran out. It has been a long time so I can't pick out details but I recall the change in quality between 5 and 6 was pretty Stark to me. One thing I remember is that instead of "smart" characters really being smart, everyone else got dumb as hell. Then the smart character would 'win' and another would make a comment basically explaining how smart they were.
I know a lot of people think it was good right until the finale and only dislike it because a certain fan favorite has a "change of character". Except it wasn't a change & was obviously telegraphed since season 3 or 4.
I don't think the finale is that bad. My only objection is that the character didn't seem to have been pushed so far based on the previous episodes. It would have been better if the previous seasons showed even more conflicts internally.
> Before the introduction of Trophies in console games there was no way for game developers to know how far their buyers actually played the game before they stopped and moved on.
It's worse today than it's ever been.
We've reached peak content; people are still playing games from the PlayStation 1. On Hacker News, there was a post about someone translating old Japanese PlayStation 1 or 2 games.
You are starting to see diminishing returns on even the big AAA games. Sony, Epic, etc are starting layoffs.
You are also seeing new trends with Zoomers, where they don't 'play' games but have some sort of weird "meta"/"metaverse" around a game like Five Nights at Freddy's.
Lord of the Rings isn't a PS1 game. A lot more people are reading old books than are playing PS1 games. I'd assume most people still playing PS1 games now are using emulators, but for a sense of scale: Something like fifty million more copies of Lord of the Rings have been printed than PlayStations made.
Emulators have hundreds of millions of views on YouTube, as a rough benchmark.
There is a huge, huge amount of interest in the "classics" across a wide range of media.
This has had the follow-on effect of watering down video game difficulty over time. It's more important to see the user steadily progressing throughout a game's content than to see them getting stuck for too long (even if there's an intellectual payoff when they finally figure it out). Many developers have cited this as a reason for changing their game mechanics in favor of ease.
It's arguable whether this is a good or bad thing for gaming, but it's unquestionable that there's a connection.
This reminds me of efforts I've been a part of to open source libraries and other bits of code at companies large and small. The reality is that most employment contracts consider the employee's output to be work made for hire, meaning that ownership rights, in whole or large part, are given to the employer. Publishing works carries both risk and a cost to ensure the right works are published under the appropriate terms, and expending this effort is only in the interest of the company as a means of retention (and, ironically, letting your workers display their talent can easily work against retention).
> Over 90% of the studio work I've made over my career is locked away forever
Speaking as a developer, my "locked away" rate is probably >99.9% on a line-by-line basis. Glad to see the concept artists faring better, I suppose.
This kind of question really illustrates a techbro level of understanding of art. Like damn, it is hopeless.
And some deeper inability to flip it onto oneself, which results in shit takes like that: "Is whatever I'm creating that remains unseen 'wasted work'? Should I be stopped from creating 'wasted work'? Is that what I would like?"
Not sure I understand your post but let me clarify.
With art, you often need to produce a looooooot of preliminary and intermediate work to achieve the final product. This isn't wasted work, as the grandparent poster seems to think.
It's similar (admittedly, not identical) in many regards with code. Each line of code stands a high chance of being revised or deleted as we iterate and get our code to a working state and then continue to improve it in the future. Those intermediate steps aren't wasted; they're how we get to where we're going.
We agree, essentially. At the risk of really labouring the point though:
It'd be most efficient for time/cost reasons if we only shipped to production code which is actually used in that build.
However, intermediate steps, or additional debug/test/migration/whatever code, are required to get to the final state. You probably don't want to ship debug stuff into production.
Therefore, of the total quantity of code written, somewhat less than 100% is useful in production.
Therefore, you don't want to ship 100% of all code written to production.
> "how can we make it so that 100.0% of the code I type is shipped in the final product?"
This isn't necessarily desirable; but phrased slightly differently perhaps reflects more about production efficiency:
"how can we make it so that 100% of the production code is the only code I typed?"
Well, I don't disagree with what you typed there but I think we're off in the weeds a bit.
"how can we make it so that 100% of the production
code is the only code I typed?"
We certainly want to eliminate as many unnecessary steps as possible.
But much of that intermediate work iteration is inseparable from the discovery and refinement process.
To answer your question literally, "how can we make it so that 100% of the production code is the only code I typed?" would only be possible if you moved all of that discovery and iteration and refinement out of the coding loop or whatever.
This is also what I consider to be one of the reasons why Austrian economics is silly. They call this work "malinvestment". Making mistakes is part of the process and not some unfortunate distortion of it.
I remember playing a game years and years ago (cannot recall the game): you come out of a mountain cave halfway up, turn left, and then enter another cave. However I stopped, and you got a view of a desert landscape (I remember it as awe-inspiring, maybe breathtaking, but it was years ago, so in reality it was probably pixelated). It looked like someone put a lot of effort into the view, and I wondered if anyone else stopped and looked or just kept going.
Point is, if I work on code, I refactor and refactor and try to make it perfect even though it already worked, and I wonder how many people working on games spent a ton of time on a scene in a game or background or sprite to get it perfect and no one pays attention. I have a friend who plays Super Mario Bros all the time just to beat it and has NEVER seen world 5, 6 or 7 because he always uses the warp zones (I am not implying those are breathtaking worlds; if they were he would be missing out).
First post I’ve seen here from Aftermath. I think the site only went live a couple days ago but seems like it’s run and owned by a bunch of veteran game journalists. I’m hoping it does well.
I wanted to argue: just break the NDA and leak the art. (Anonymously; say your Dropbox was hacked; it's unlikely action will be taken against you.)
I wanted to argue that art leaks are great for the community, even if they harm the company. But I remember reading how the HL2 beta leak tanked morale at Valve, and arguably if the team's morale is hurt, that ultimately hurts the community too.
Still, at some point the culture there shifted, and people who had been working on Episode 3 released their art, storyline, etc. (though some later regretted it).
Bring back cracktros, innovative art.
Most video game "artwork" is a more-of-the-same industrial product, not much different from "art" in advertising.
Most AI-generated art will also never be seen. Honestly, I don't care & I'm not the only one.
> Like yeah you can own the pictures, man, but you shouldn't be allowed to keep us from using our labor to secure future jobs.
As a developer, I wouldn't ever consider submitting proprietary code I may still have from a previous or current job when applying for another job (not least because that may leave a bad impression of how I handle confidential stuff). Why should this be different for artists?
I guess it's like sharing a compiled binary. As long as you aren't sharing the "source" (i.e. the vector file or the layered Photoshop file) and are only sharing a low-resolution JPEG, then it's not so bad? Not saying it's ethically okay, just that it's not that useful for reuse by the target in the form it's presented.
What? Sharing the binary vs. source is not the kind of line you get to draw unless you're the one paying for it to be developed.
I've worked on projects that are available on the market and I can't even point at it and suggest I had anything to do with it, or the companies involved with making it.
No, I am not being snarky. Actually, artists are special: it's one of the few jobs where portfolios matter much more than education, and as much as (or more than) interviews.
Does it matter? Like, yeah, there's a seeming incongruity at first pass. But that doesn't mean it's important.
Games are a consumer product meant to be widely available, and a big part of them is visual spectacle. Some of the visuals designed in creating that product are kept secret. I see the apparent contradiction, but the lede of this article calls that "one of the greatest injustices."
Precisely. A huge amount of work in every field never sees the light of day. That doesn't mean it's not important, it just means the person doing the work is curating what goes "above the waterline" and what stays in the background as prep.
The obvious example of this same thing in literature is Tolkien, who invented whole alphabets and languages but only included tiny fragments of them in his published works (prior to his death - obviously his estate has been releasing everything remorselessly).
Or an example from a totally unrelated field: there are lots of cases in, say, maths where you have to do a bunch of work to choose a value, but that's considered "scratch work" and gets removed when you submit the final version. So everyone sees your epsilon-delta proof of a limit, for example; they see your brilliant choice of delta, but the work you put into finding the delta isn't part of the proof. The proof just shows that choice of delta makes the proof work for any epsilon. No one reading it gets to know how you figured that out, because that's not what a proof is for.
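To make that concrete, here's a standard textbook instance (my illustration, not the commenter's), where the scratch work that finds delta never makes it into the published proof:

    % Scratch work (discarded): |x^2 - 4| = |x-2||x+2|; if |x-2| < 1 then
    % |x+2| < 5, which suggests trying delta = min(1, epsilon/5).
    \textbf{Claim.} $\lim_{x \to 2} x^2 = 4$.
    \textbf{Proof.} Let $\varepsilon > 0$ and set $\delta = \min(1, \varepsilon/5)$.
    If $0 < |x - 2| < \delta$, then $|x - 2| < 1$, so $|x + 2| \le |x - 2| + 4 < 5$,
    and hence $|x^2 - 4| = |x - 2|\,|x + 2| < 5\delta \le \varepsilon$. \qed

The reader only ever sees the min(1, epsilon/5); the backwards work that produced it stays in the margin.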
In the case of games, the artwork that isn't released is part of the worldbuilding and refining the concept for the game. It's absolutely not wasted effort even if it's never seen by anyone outside the dev team.
The meat of the article is not talking about an injustice to the consumer. It's arguing that there's an injustice to the artist because artists' portfolios are critical to their career advancement. It's not enough to say "I worked on this team." You need to literally show your personal results in detail to get your next job. But, artist employers are requiring that the majority of their results be locked in a vault forever.
Some are even pushing to claim ownership of all art made during employment. That's blatantly a move to repress the employee's negotiating power.
I don't buy the argument; there are way too many game studios for them all to be like that. Some will offer friendlier terms and will become somewhat more attractive to artists, while the restrictive studios will become somewhat less attractive.
I get what you are saying. I also received a settlement check from a group of major game & movie studios who admitted to actively colluding to suppress wages.
This case isn't even explicit collusion. It's a "That's just the way it is, kid. Deal with it." implicit collusion.
Oh, I thought the article was going to be about how it's because people suck at games. But I guess these days with lets plays you don't have to earn the right to see the later levels.
This article is specifically about not being able to include some of your work in your portfolio, and this is not a problem that's unique to video game artists. It's true of all sorts of people in the entertainment industry -- writers, actors, and so forth.
But in reality it's not really a problem. Sure, you absolutely cannot take past contracted work that never saw the light of day and put it in a public portfolio without permission. Rightly, no team wants any part of the creative process leaked, except when the people at the top choose to.
But this still leaves:
1) A public portfolio of stuff that was actually released by the companies (final video game clips, movie clips, etc.)
2) A public portfolio of personal material -- projects of your own, designed to showcase your talents at their absolute best
3) A private portfolio of unreleased stuff that you can selectively and judiciously show in private interviews, when it's not competing (e.g. show your Dove soap ads to Snickers, don't show them to Dial soap)
And that generally works quite well. Especially your personal portfolio of material (#2) -- everything you work on professionally is often compromised by top-down direction. But when you create stuff that's 100% yours, it allows you to show your unique gifts in the best possible light.
It is a problem that (3) is something that could get you sued, depending on the specifics, although I'll grant that almost never happens.
I have work for major big brands that I suspect would land me more clients but I can't even mention the relationship. I signed the NDA, so it wasn't a surprise, but it is frustrating that the power imbalance allows that.
> HN skews toward believing that if you can be sued, you will be.
Maybe it's more like, it doesn't matter how correct you are unless you can afford a good lawyer. The infinite monkey theorem is not applicable. Lawyers who make the bulk of their money from being on the suing side (or not necessarily suing, just being hired to be able to sue) rather than the defending side are not monkeys randomly pressing keys on their keyboards.
Then again, not everyone's threat model for "getting sued" is the same.
There's a similar problem in the software dev world. It's very desirable to provide examples of your past work, but if that work was done for an employer, then you usually can't show it to anyone.
Lots of devs have hobby projects in part to generate work that they own and therefore can show to others.
A similar realization had a major impact on how Blizzard approached end-game raiding in WoW. In the first 2 expansions, only the most dedicated players were seeing the final boss and culmination of the storylines. It took 40 players working together to get there, and those 40 players had to execute complex fights in order to reach the end. Though a very rewarding experience for those who could do it, Blizzard did the math and realized they were excluding the VAST majority of their playerbase from the coolest content. They were spending tons of money creating this content and no one was experiencing it!
In the 3rd expansion and ever since then, end-game content has been tuned to be a lot more forgiving and to require fewer people. There are still complex and rewarding fights, but ramping up the difficulty is "opt-in", generally speaking.
This is likely one of the reasons the game remained so successful for so long. Prior to this mentality shift, it was very common for end game MMO content to only be seen by a small minority of the players.
A similar thing happened with Star Wars Galaxies. Part of the original pitch was that, like, <1% of players would have access to the Force. Only from there, with super hard training, could they become Jedi. This matches the lore of the universe; Jedi (especially in the OT era) are the rarest of creatures.
This was accurate, and would make seeing a Jedi in the game world the coolest thing imaginable, but it was a disaster for the finances of running the game.
Obviously, EVERYONE wants to be a Jedi. So they re-tooled it so that anyone could. Eventually the whole game was replaced with another MMO that was set in the ancient ages when they were common.
1 year ago this article would have been much more relevant. With the advent of image generation via AI, artwork is becoming more commoditized and this is becoming less relevant.
It's sad to see this happen in a year but this is literally the first thing I thought about when I read the headline.
A tangent, but this reminds me of the fact that even though we have millions of photographers with amazing cameras, most news and culture sites display images in low quality.
So most great photos will also never be seen. I remember the Boston Globe from years ago with great photo content that really captured the vibe of so many places and people in the news.
These days news means boring propaganda PR pictures or tiny images with artifacts, very weird. It's like we're not as close to the actual world as we were 10+ years ago.
Probably also about licensing, but it's stupid because most photos won't end up as "press photo of the year" or whatever they are saving their HQ versions for.
This bothers me a lot. It reminds me of old movies and TV shows using garbage low-quality props that eventually look like Monopoly money instead of dollars at current resolutions, because the quality of broadcast and even projection at the time of filming was so low nobody would be able to tell. Future-proof your work, for God's sake.
This article feels like the art equivalent of engineers saying the code they write in their jobs will never be open sourced. I mean, yeah, that's somewhat unfortunate, but not all businesses can survive operating that way, and you kind of know what you signed up for when you went in. It seems like the artists interviewed in this article know that, but would like to see some reasonable limits placed on it, like a statute of limitations.
I can relate. Once, say, 10 years have passed, the only people who care about that code I wrote for a job 10 years ago would probably be me, and I'd like to be able to do with it what I want.
> While some of the commercial reasons for keeping game art under wraps make sense, many artists working in the video game industry say they're subject to a power imbalance, even in full-time studio positions, that sees the bulk of their work locked away in vaults, where not only can fans never see them but where artists can't share them either, not even in professional settings like job applications or portfolios.
This is actually a pretty big problem. My cofounder and I run a small video game studio and we worked with our lawyers to figure out how to protect artists as well as the company in a way that seemed fair. What we came up with was a general blanket rule that if an artist's work gets used in any public way (promo materials, game launch) then they can use that work immediately for their public portfolios. In the situation where it's not released we have end dates in the contracts for when they can use their work in their portfolios.
> If an artist's work gets used in any public way (promo materials, game launch) then they can use that work immediately for their public portfolios
Do you see this case being prohibited as the norm for other studios? (the cited section only talks about art "locked away in vaults")
> In the situation where it's not released we have end dates in the contracts for when they can use their work in their portfolios.
That's actually quite charitable, but IMO it's understandable how difficult this is for larger companies. After all, your company paid for that product of work, which also reveals parts of your creative process.
Imagine Ford paying someone to make five car-designs, ending up using one of them and ultimately losing full control over the remaining four...
>Imagine Ford paying someone to make five car-designs, ending up using one of them, and ultimately losing full control over the remaining four...
Which is how things used to work. The Isuzu Impulse was originally a design for Audi and then BMW by Giugiaro which was turned down, and Giugiaro then reworked it a bit when Isuzu came asking for a new design. Originally the Cizeta V16T was a Lamborghini Countach replacement, before Lamborghini turned it down and Moroder bought the rights to it. The Chrysler minivans of much fame were originally a Ford design from 1972 called the Ford Carousel that Henry Ford II told Lee Iacocca to buzz off with... which he did, after a minor rhinoplasty to the thing. And they were a smash hit that was half of the equation that saved Chrysler in 1983 and left Ford almost dead.
> Imagine Ford paying someone to make five car-designs, ending up using one of them and ultimately losing full control over the remaining four...
Yeah, this isn't a valid comparison in the slightest. Artists using work for their public portfolio doesn't mean that they can use the work for other employers or in other contexts. Not sure how you made such a large leap here.