https://youtu.be/1ih5BxnJu2I?si=CPfQdtit5aVVBDOR
Around 20 mins, near ghost tools.
Yes. There’s this talk and another on melee balancing and HP inflation specifically. Both are really great talks.
I’m an expert in game design and economy design (10+ years of professional experience).
You do this so that health doesn’t feel rare. The same goes for ammo: if you stop dropping ammo just because the player is full, the player may believe ammo is rare, hoard it, and not shoot. So if you want to incentivize players to take risks, you drop health and ammo even at full, so the player feels free to experiment.
This was noted in the GDC talk for Ghost of Tsushima: they do bump up the drop rates when you’re low to give more than usual, but they don’t do the reverse (e.g. give you none at full), because they found in playtesting that players hoarded ghost tools (and therefore didn’t use them) unless they believed plenty were available.
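Not how Sucker Punch actually built it, just a minimal Python sketch of that asymmetric idea: boost drops when the player is low, and keep a floor on drops at full so nothing ever feels scarce. All names and numbers here are made up for illustration.

```python
import random

# Hypothetical tuning values for illustration; the talk doesn't give exact numbers.
BASE_DROP_CHANCE = 0.25   # enemies still drop health/ammo/tools even at full
LOW_RESOURCE_BOOST = 2.0  # multiplier applied when the player is running low
LOW_THRESHOLD = 0.3       # "low" means below 30% of maximum

def drop_chance(current: float, maximum: float) -> float:
    """Step up drops when the player is low, but never zero them out at full."""
    if current / maximum < LOW_THRESHOLD:
        return min(1.0, BASE_DROP_CHANCE * LOW_RESOURCE_BOOST)
    # No branch that returns 0.0 when full -- keeping the floor is the whole point.
    return BASE_DROP_CHANCE

def should_drop_health(player_hp: float, player_max_hp: float) -> bool:
    return random.random() < drop_chance(player_hp, player_max_hp)
```

The design choice is the one-way adjustment: generosity scales up when you’re hurting, but drops never disappear at full, so the resource keeps feeling plentiful enough to spend.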
Starting off with “we’ve heard your feedback” is something I’ve never heard from an abusive parent?
David Bowie and Prince both bent and blurred gender lines while still being attractive, unique, and amazingly talented. Bowie died really close to his birthday, and both dates are close to my birthday.
When he died, I decided to check off some of my bucket list items, like performing in drag. Whenever I’ve felt self-conscious, thinking about these icons really helped me be comfortable with myself and my journey.
I really miss both of them as a fan. :/ I wish I had seen them live.
My ADHD tax is currently $377 every month for Vyvanse.
About to be a lot of “accidental” falls out of windows.
I believe in UBI, but the Captain Laserhawk show made me aware of how much it could get twisted in fucked up ways. “Don’t watch this show? -$100 from your stipend this month.” I used to think things like that were fear mongering, but the world is all kinds of weird today.
Maybe more apt for me would be, “We don’t need to teach math, because we have calculators.” Like…yeah, maybe a lot of people won’t need the vast amount of domain knowledge that exists in programming, but all this stuff originates from human knowledge. If it breaks, what do you do then?
I think someone else in the thread said good programming is about the architecture (maintainable, scalable, robust, secure). Many LLMs are legit black boxes, and it takes humans to understand what’s coming out, why, and whether it’s valid.
Even if we have a fancy calculator doing things, there still need to be people who can do the math and check it. I’ve worked more with analytics than LLMs, and more times than I can count, the data was bad. You have to validate before everything else; otherwise it’s garbage in, garbage out.
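For what it’s worth, here’s a minimal Python sketch of the kind of pre-analysis validation I mean; the column names and checks are made up for the example.

```python
import pandas as pd

def validate_events(df: pd.DataFrame) -> pd.DataFrame:
    """Basic sanity checks before any analysis -- garbage in, garbage out."""
    required = {"user_id", "event_time", "revenue"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")

    # Drop rows with impossible values rather than silently aggregating them.
    df = df.dropna(subset=["user_id", "event_time"])
    df = df[df["revenue"] >= 0]

    # Timestamps in the future usually mean a client clock or pipeline bug.
    times = pd.to_datetime(df["event_time"], utc=True, errors="coerce")
    df = df[times.notna() & (times <= pd.Timestamp.now(tz="UTC"))]
    return df
```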
It sounds like a poignant quote, but it also feels superficial. Like, something a smart person would say to a crowd to make them go, “Ahh!” but that doesn’t hold water for long.
I generally agree. It’ll be interesting to see what happens with models, the datasets behind them (particularly copyright claims), and more localized AI models. There have been tasks where AI greatly helped and sped me up, particularly quick Python scripts to solve a rote problem, along with early / rough documentation.
However, using this output as justification to shed head count is questionable to me because of the broader business impacts (succession planning, tribal knowledge, human discussion around creative efforts).
If someone is laying people off specifically to gap-fill with AI, they are missing the forest for the trees. Morale impacts whether people want to work somewhere, and I’ve been fortunate enough to enjoy the company of 95% of the people I’ve worked alongside. If our company shed major head count in favor of AI, I would probably have one foot in and one foot out.
This has been my general worry: the tech is not good enough, but it looks convincing to people with no time. People don’t understand you need at least an expert to process the output, and likely a pretty smart person for the inputs. It’s “trust but verify”, like working with a really smart parrot.
Yeah, this phrase makes way more sense within the context of a game or game theory. For me, it goes back to fighting games or sports. People play to win in those settings. The rules are heavily defined, and the players must abide. These other examples are people misusing the phrase.
It’s not as much. GaaS is the predominant model, and you make more on the LiveOps side than during the launch recoup period.
Source: developer of 10 years, ex-Director at a 200-person company.
You look at the festive dish that has seemingly grown consciousness. Others wait impatiently behind you, expecting you to dig in.
There was a similar study reported the other day that used fMRI imaging and AI to recreate the “thought content” of someone’s brain. It required training the AI on that specific person’s brain, plus some other training. It does seem these techniques can work with models tailored to an individual, but yeah, it doesn’t seem like hooking someone’s brain up to this would create a movie of their mind or something.
I think the more dangerous part is that this is step 0; this tech would have seemed impossible 10 years ago. Very strange times.
Easy back for me. The original RoA is one of my favorite platform fighters. I’m happy to support Dan and crew for their next venture. I can’t wait till beta opens. :)
Game designer.
I’m a Director of Game Design now.
What if Mark has been a sentient AI for some time?