Graf is probably better than ever now; shame it's not as long-lasting as it was in Rome. Hopefully some of the best artists pick up sculpting so people in the future might get a little taste.
Digital pictures of their work last indefinitely, just saying
Edit: To everyone mentioning that digital photos aren’t guaranteed to last forever: yup, that’s what “indefinitely” kinda means. There is no guarantee. They could last only another day if the medium holding the only copy fails, or thousands of years if they’re properly backed up in a lossless format.
I feel like there have to be real text exchanges similar to this, but I always assume they’re fake, because they’re just too easy to fabricate. I wonder how many of them were actually real and I just didn’t believe it. Would’ve made them way funnier (not that they aren’t funny if they’re fake, but still).
Blocked all big piracy communities, and is trying to move to Discord for all announcements.
To add my personal opinion: like, why the fuck? They gain nothing from the Discord thing, and the piracy thing has been shown to be pretty safe, given that email servers operate similarly.
Now it makes sense. The dream of universal access to knowledge was actually the iPhone’s, and it came because the phone was dying and seeing death visions, like life flashing before its eyes.
40? I remember when they were 20. Hell, I remember when you could get slightly older titles for 10. I used to go to Egghead and buy slightly older games with my allowance.
Optical discs are dirt cheap. This old answer from Quora says physical media (disc, case, artwork, inserts, etc) accounted for $2-$5 of the cost of a game.
Yes, if you’re selling millions of units. But if you’re buying just one, $2-$5 probably isn’t going to matter to you. Not many people would buy a game at $68 they wouldn’t buy at $70.
If you add in all the season passes, you’re paying the same or even more, with further microtransactions on top.
Games in general now have a longer shelf life
AAA games in my country have been 69,99€ since the PS3 launch and now they’re asking 79,99€. It’s true development costs have ballooned, but I just don’t think that’s a good price/time ratio and rarely do I buy games over 15€. I really don’t mind waiting a couple years.
You can buy musical instruments for that price: software or hardware synthesisers, for example.
But that’s exactly the point, I’d rather pay double, triple, quadruple for something I know I’ll use for hundreds of hours (a monitor, a new keyboard, a Steam Deck) than 80€ for a game that will last me 12 to 30 hours (I only play offline story-based games).
Even if I considered game X, there are decades’ worth of games available for under 10€ that I would rather get now, or a Humble Bundle to buy, while waiting for a sale.
The issue is if all publishers start to follow Nintendo’s model and don’t drop their prices much.
TotK could be the best game ever made (and I don’t think that’s too far-fetched given how good it is) and I still couldn’t justify anything higher than 50€, 60€ being generous.
I dunno. Baldur’s Gate 3 has a truly unbelievable amount of content in it. $70 for it is almost unfair when you consider how far $70 gets you in almost any other hobby.
Someone told me something similar about Tears of the Kingdom and my answer is the same: BG3 could be the greatest game ever made with content from here to eternity, but $70 is still too much for a game. Especially considering who ends up benefitting the most from the sales.
That makes zero sense. Explain why BG3 is not worth $70. Give me real data showing that. How much should it cost considering how many people worked on it and how much was spent developing it?
It takes 75-100 hours to beat the game, and that’s just one playthrough, which can take even longer depending on play style. This is the kind of game people can get several hundred or thousands of hours out of. Show me any other hobby where you can spend $70 one time and get hundreds of hours of enjoyment.
Hell, even if you sped through the game as fast as possible and spent 50 hours (made-up number, not sure what a speedy playthrough takes), that’s still a LOT of time for the money spent. Take an Uber out to a movie with friends, then go to a restaurant, then Uber back home, and you’ll have bought at least two copies of BG3, yet gotten only a few hours of entertainment.
There are next to no other forms of entertainment that give you that many hours for your money.
I have devoted that many hours or even more to some games and still think the 40-50€ that each of them cost me when I bought them was too much.
Entertainment shouldn’t be that expensive. Period.
I don’t agree. Development costs money and I’m willing to pay for it. I usually compare it to other everyday things, such as a nice restaurant visit. Things cost money.
Just because I’m curious, what would you feel to be a fair price for one of those games?
Except most of the revenue from game sales doesn’t go to those who actually develop the games. We all know gamedevs aren’t paid enough and sometimes do a lot of crunch, especially in big studios. We can’t ignore that fact.
Imo I could excuse a maximum of 50€ (or dollars in this particular case), and the ideal would be something between 30 and 40.
Depends on the studio of course, but I bet in the general case they wouldn’t be paid more if the price were lowered. It’d be fun to investigate the margins, but I don’t care enough to do so.
The games I play the most are actually from reputable studios and/or indie devs whom I don’t mind supporting. Except Football Manager, but I don’t buy new revisions and have clocked enough hours to feel OK with the price.
There is a reason for it, isn’t there? Bullshit is motivated to manipulate and to spread propaganda, while truth-based journalism needs professionals to do due diligence. We can argue for better journalism, but wishing for everything to be free ain’t gonna work.
Unless we are okay with… ads. We wouldn’t tolerate that either, would we?
I personally never rebase. It always seems to have some problem. I’m sure there’s a place and time for rebasing, but I’ve never seen it in action, I guess.
If your cherry-pick doesn't run into conflicts why would your merge? You don't need to merge to master until you're done but you should merge from master to your feature branch regularly to keep it updated.
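To make the “merge from master into your feature branch regularly” part concrete, here’s a throwaway-repo sketch (all branch and file names are made up for illustration):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b master
git config user.email dev@example.com
git config user.name dev

echo base > base.txt && git add base.txt && git commit -q -m "initial"

# start a feature branch and do some work
git switch -q -c feature
echo work > feature.txt && git add feature.txt && git commit -q -m "feature work"

# meanwhile, master moves on
git switch -q master
echo more > master.txt && git add master.txt && git commit -q -m "master moves on"

# pull master's changes into the feature branch to keep it current;
# you only merge back the other way when the feature is done
git switch -q feature
git merge -q -m "Merge master into feature" master
git log --oneline
```

After the merge, the feature branch contains master’s latest changes, so the eventual merge into master is far less likely to conflict.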
(I’m also a fan of rebasing; but I also like to land commits that perform a logical and separable chunk of work, because I like history to have decent narrative flow.)
That is absolutely not what rebasing does. Rebasing rewrites the commit history, cherry picking commits then doing a normal merge does not rewrite any history.
I’m sorry but that’s incorrect. “Rewriting the commit history” is not possible in git, since commits are immutable. What rebase actually does is reapply each commit between upstream and head on top of upstream, and then reset the current branch to the last commit applied (This is by default, assuming no interactive rebase and other advanced uses). But don’t take my word for it, just read the manual. git-scm.com/docs/git-rebase
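You can watch this happen in a throwaway repo (names made up): after a rebase, the branch points at a freshly created commit, while the original commit object is untouched and still present in the object store.

```shell
set -e
cd "$(mktemp -d)"
git init -q -b main
git config user.email dev@example.com
git config user.name dev

echo base > base.txt && git add base.txt && git commit -q -m "initial"

git switch -q -c topic
echo t > topic.txt && git add topic.txt && git commit -q -m "topic work"
old=$(git rev-parse HEAD)

git switch -q main
echo m > main.txt && git add main.txt && git commit -q -m "main moves on"

# reapply the topic commit on top of main
git switch -q topic
git rebase -q main
new=$(git rev-parse HEAD)

# the branch now points at a new commit object;
# the original commit still exists, unmodified:
echo "old=$old new=$new"
git cat-file -t "$old"
```

The rebased commit has a different hash because its parent changed; the old one stays reachable through the reflog until it is garbage-collected.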
“Reapply” is rewriting it on the other branch. The branch you are rebasing onto now has one or more commits that do not represent real history. Only the very last commit on the branch is actually what the user rebasing has on their computer.
My biggest issue with GitHub is that it always squashes and merges. It’s really annoying, as it not only takes away from the commit history but also puts the fork out of sync with the main branch, and I’ll often realize this only after having implemented another feature, forcing me to cherry-pick just to fix it. Luckily LazyGit makes this process pretty painless, but still.
Seriously people, use FF-merge where you can.
Then again, if my feature branch has simply fallen behind upstream, I usually pull and rebase. If you’ve got good commits, it’s a really simple process and saves me a lot of future headaches.
There are obviously places not to use rebase (like when multiple people are working on a branch), but I consider it good practice to always rebase before merge. This way, we can always just FF-merge and avoid screwing with the Git history. We do this at my company and honestly, as long as you follow good practices, it should never really get too out of hand.
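The rebase-then-fast-forward flow looks like this in a throwaway repo (branch and file names made up, assuming no conflicts):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b main
git config user.email dev@example.com
git config user.name dev

echo base > base.txt && git add base.txt && git commit -q -m "initial"

git switch -q -c feature
echo f > feature.txt && git add feature.txt && git commit -q -m "feature: add thing"

git switch -q main
echo m > main.txt && git add main.txt && git commit -q -m "main: unrelated work"

# 1. replay the feature commits on top of the current main...
git switch -q feature
git rebase -q main

# 2. ...so main can fast-forward: no merge commit, linear history
git switch -q main
git merge --ff-only -q feature
git log --oneline
```

`--ff-only` makes the merge fail loudly if a fast-forward isn’t possible, so you can’t accidentally create a merge commit.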
Sounds like I just gotta get better with rebasing. But generally I do my merges clean from local changes. I’ll commit and push, then merge in, push. Then keep working. Not too hard to track but I’ve found it’s the diff at MR time that people really pay attention to. So individual commits haven’t been too crucial.
Yeah, I am. However GitHub, being the biggest Git hosting provider and all that, makes you use merge commits. FF-merges must be done manually from the command line. While this definitely isn’t a problem for me, many people out there just don’t care and merge without a second thought (which, as I said in my comment, results in having to create a new branch and cherry picking the commits onto there).
Always merge when you’re not sure. Rebasing rewrites your commit history, and merging with the squash flag discards history. In either case, you will not have a real log of what happened during development.
Why do you want that? Because it allows you to go back in time and search. For example, you could be looking for the exact commit that created a specific issue using git bisect. Rebasing all the commits in a feature branch makes it impossible to be sure they will even work, since they represent snapshots that never existed.
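Here’s a self-contained sketch of that bisect workflow in a throwaway repo (commit messages and file contents are made up; the “bug” is just a changed word that a grep can detect):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b main
git config user.email dev@example.com
git config user.name dev

# a run of good commits...
echo ok > status.txt && git add status.txt && git commit -q -m "good 1"
for i in 2 3 4; do
  echo "note $i" > note.txt && git add note.txt && git commit -q -m "good $i"
done

# ...one commit that introduces the bug...
echo fail > status.txt && git add status.txt && git commit -q -m "introduce bug"

# ...and more commits on top of it
for i in 6 7 8; do
  echo "note $i" > note.txt && git add note.txt && git commit -q -m "later $i"
done

# HEAD is known bad, the first commit is known good
git bisect start HEAD HEAD~7

# bisect checks out midpoints and runs the test command
# until it isolates the first bad commit
git bisect run sh -c 'grep -q ok status.txt'

culprit=$(git log --format=%s -1 refs/bisect/bad)
git bisect reset
echo "first bad commit: $culprit"
```

With a real test suite as the run command, this finds the offending commit in O(log n) steps instead of checking every commit by hand, which is exactly why preserving real, working commits matters.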
I’ll never understand why people suggest you should default to rebasing. When prompted about why, it’s usually some story about how it went wrong and it was just easier to do it the wrong way.
I’m not saying never squash or rebase. It depends on the situation but if you had to pick a default, it should be to simply merge.
I try to structure my commits in a way that minimizes their blast radius, which usually means trying to reduce the number of files I touch per commit.
For example, my commit history would look like this:
Add new method to service class
Use new service class method in worker
And then, as I continue working, each change gets committed with git commit --fixup against one of those two commits’ hashes, depending on where it occurs.
And when it’s time to rebase in full, I can do a git rebase master --interactive --autosquash.
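For anyone who hasn’t seen that flow end to end, here’s a throwaway-repo sketch (commit messages taken from the example above; file contents and branch names are made up, and `GIT_SEQUENCE_EDITOR=true` just accepts the generated todo list so the interactive rebase runs unattended):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b master
git config user.email dev@example.com
git config user.name dev

echo base > base.txt && git add base.txt && git commit -q -m "initial"

git switch -q -c feature
echo 'def helper(): pass' > service.py && git add service.py
git commit -q -m "Add new method to service class"
echo 'helper()' > worker.py && git add worker.py
git commit -q -m "Use new service class method in worker"

# a later tweak that logically belongs with the first commit:
echo 'def helper(): return 1' > service.py && git add service.py
git commit -q --fixup HEAD~1

# --autosquash reorders the fixup! commit under its target
# and folds it in, leaving two clean commits
GIT_SEQUENCE_EDITOR=true git rebase --interactive --autosquash master
git log --format=%s master..feature
```

After the rebase, the branch has exactly the two logical commits, with the later tweak absorbed into the first one.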
I’ve always merged. Rebase simplifies the history graph, but realistically I can’t think of a time where that has been important to me, or any of the teams I’ve worked with.
Maybe on some projects with a huge number of concurrent branches it becomes more important, probably less so for smaller teams.
I’ve never been to a rock concert. I’ve been to plenty of shows in clubs, but never a concert. It always felt like it wouldn’t be worth it unless I paid a ton of money for close up seats. And then I’d go deaf.