
programmer_humor


RustyNova , in Surely "1337" is the same as 1337, right?

To whoever does that, I hope that there is a special place in hell where they force you to do type safe API bindings for a JSON API, and every time you use the wrong type for a value, they cave your skull in.

Sincerely, a frustrated Rust dev

skuzz ,

“Hey, it appears to be int most of the time except that one time it has letters.”

throws keyboard in trash

Username ,

Rust has perfectly fine tools to deal with such issues, namely enums. Of course that cascades through every bit of related code and is a major pain.

RustyNova ,

Sadly it doesn’t fix the bad documentation problem. I often don’t care that a field is special and either gives a string or a number. This is fine.

What is not fine, and what should sentence you to eternal punishment, is not clearly documenting it.

Don’t you love it when you publish a crate, have tested it on thousands of returned objects, only for the first issue to be “field is sometimes null/other type”? You really start questioning everything about the API, and sometimes you’d rather parse it as serde_json::Value and call it a day.
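A defensive normalization shim is one way to survive such an API. This is a hypothetical sketch (the helper name and the accepted formats are made up, not from anyone’s published crate):

```javascript
// Accept a field documented as "number" that sometimes arrives as a string.
function asInt(value) {
  if (typeof value === "number" && Number.isInteger(value)) return value;
  if (typeof value === "string" && /^-?\d+$/.test(value)) {
    return parseInt(value, 10);
  }
  throw new TypeError(`expected an int-like value, got ${JSON.stringify(value)}`);
}

asInt(1337);   // → 1337
asInt("1337"); // → 1337, the quoted variant from the post title
```

Anything else (floats, `"13x7"`, null) fails loudly instead of silently producing garbage.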

skuzz ,

API is sitting there cackling like a mad scientist in a lightning storm.

skuzz ,

True, and also true.

bleistift2 OP ,

This man has interacted with SAP.

Rednax ,

The worst thing is: you can’t even put an int in a json file. Only doubles. For most people that is fine, since a double can function as a 32 bit int. But not when you are using 64 bit identifiers or timestamps.

firelizzard ,
@firelizzard@programming.dev avatar

That’s an artifact of JavaScript, not JSON. The JSON spec states that numbers are a sequence of digits with up to one decimal point. Implementations are not obligated to decode numbers as floating point. Go will happily decode into a 64-bit int, or into an arbitrary precision number.

Aux ,

What that means is that you cannot rely on numbers in JSON. Just use strings.

JackbyDev ,

Unless you’re dealing with some insanely flexible schema, you should be able to know what kind of number (int, double, and so on) a field should contain when deserializing a number field in JSON. Using a string does not provide any benefits here unless there’s some bug in your deserialization process.

Aux ,

What’s the point of your schema if the receiving end is JavaScript, for example? You can convert a string to BigNumber, but you’ll get wrong data if you’re sending a number.

JackbyDev ,

I’m not following your point so I think I might be misunderstanding it. If the types of numbers you want to express are literally incapable of being expressed using JSON numbers then yes, you should absolutely use string (or maybe even an object of multiple fields).

sukhmel , (edited )

The point is that everything is expressible as JSON numbers; it’s when those numbers are read by JS that there’s an issue.

JackbyDev ,

Can you give a specific example? Might help me understand your point.

sukhmel ,

I am not sure what could be the example, my point was that the spec and the RFC are very abstract and never mention any limitations on the number content. Of course the implementations in the language will be more limited than that, and if limitations are different, it will create dissimilar experience for the user, like this: Why does JSON.parse corrupt large numbers and how to solve this

JackbyDev ,

This is what I was getting at here programming.dev/comment/10849419 (although I had a typo and said big instead of bug). The problem is with the parser in those circumstances, not the serialization format or language.

sukhmel ,

I disagree a bit in that the schema often doesn’t specify limits and operates in JSON standard’s terms, it will say that you should get/send a number, but will not usually say at what point will it break.

This is the opposite of what the C language does, which is so specific that it is not even Turing-complete (in a theoretical sense; in practice it is).

JackbyDev , (edited )

Then the problem is the schema being underspecified. Take the classic pet store example. It says that the id is int64. petstore3.swagger.io/#/store/placeOrder

If some API is so underspecified that it just says “number” then I’d say the schema is wrong. If your JSON parser has no way of parsing numbers as arbitrary-precision number types (like BigDecimal in Java) then that’s a problem with your parser.

I don’t think the truly truly extreme edge case of things like C not technically being able to simulate a truly infinite tape in a Turing machine is the sort of thing we need to worry about. I’m sure if the JSON object you’re parsing is some astronomically large series of nested objects that specifications might begin to fall apart too (things like the maximum amount of memory any specific processor can have being a finite amount), but that doesn’t mean the format is wrong.

And simply choosing to “use string instead” won’t solve any of these crazy hypotheticals.

sukhmel ,

Underspecified schema is indeed a problem, but I find it too common to just shrug it off

Also, you’re very right that just using strings will not improve the situation 🤝

bleistift2 OP ,

What makes you think so?


const bigJSON = '{"gross_gdp": 12345678901234567890}';
JSON.parse(bigJSON, (key, value, context) => {
  if (key === "gross_gdp") {
    // Ignore the value because it has already lost precision
    return BigInt(context.source);
  }
  return value;
});
> {gross_gdp: 12345678901234567890n}

developer.mozilla.org/en-US/docs/Web/…/parse

Aux ,

Because no one is using JSON.parse directly. Do you guys even code?

bleistift2 OP ,

It’s neither JSON’s nor JavaScript’s fault that you don’t want to make a simple function call to properly deserialize the data.

Aux ,

It’s not up to me. Or you.

Carighan ,
@Carighan@lemmy.world avatar

Relax, it’s just JSON. If you wanted to not be stringly-typed, you’d have not used JSON.

(though to be fair, I hate it when people do bullshit types, but they got a point in that you ought to not use JSON in the first place if it matters)

RustyNova ,

As if I had a choice. Most of the time I’m only on the receiving end, not the sending end. I can’t just magically use something else when that something else doesn’t exist.

Heck, even when I’m on the sending end, I’d use JSON. Just not bullshit ones. It’s not complicated to only have static types, or having discriminant fields

Mubelotix ,
@Mubelotix@jlai.lu avatar

You HAVE to. I am a Rust dev too and I’m telling you, if you don’t convert numbers to strings in json, browsers are going to overflow them and you will have incomprehensible bugs. Json can only be trusted when serde is used on both ends

RustyNova ,

This is understandable in that use case. But it’s not everyday that you deal with values in the range of overflows. So I mostly assumed this is fine in that use case.

Aux ,

Well, apart from float numbers and booleans, all other types can only be represented by a string in JSON. Date with timezone? String. BigNumber/Decimal? String. Enum? String. Everything is a string in JSON, so why bother?

RustyNova ,

I got nothing against other types. Just numbers/misleading types.

Although, enum variants shall have a label field for identification if they aren’t automatically inferable.

Aux ,

Well, the issue is that JSON is based on JS types, but other languages can interpret the values in different ways. For example, Rust can interpret a number as a 64 bit int, but JS will always interpret a number as a double. So you cannot rely on numbers to represent data correctly between systems you don’t control or systems written in different languages.

sukhmel , (edited )

No problem with strings in JSON, until some smart developer you get JSONs from decides to use string and number interchangeably, and maybe a boolean (but only false) to show that the value is not set, and of course null for a missing value that was supposed to be optional all along, but go figure that it was.

worldwidewave , in Site: "I don't feel so good...."

Crossfade that tag with another HTML element calling the clients cheap, which gets darker every day.

JackbyDev , in Trying to understand JSON…

Just what every programming language needs: not one, but two types of null! Because one type clearly wasn’t difficult enough.

If I see any of you make this distinction matter for anything other than “PUT vs. PATCH” semantics I’m going to be very angry.

bleistift2 OP ,

I do this constantly. undefined: not retrieved yet. null: Error when retrieving. Makes it easy to reason about what the current state of the data is without the need for additional status flags.
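A minimal sketch of that convention (the variable and function names are illustrative, not from a real codebase):

```javascript
// undefined: not retrieved yet; null: retrieval failed; object: data ready.
let user; // starts undefined — nothing has been fetched yet

function render() {
  if (user === undefined) return "Loading…";
  if (user === null) return "Failed to load user";
  return user.name;
}

// render() → "Loading…"; after a failed fetch sets user = null,
// render() → "Failed to load user"; after a successful fetch, the name.
```

No extra status flags needed: the value itself carries the fetch state.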

MostlyBlindGamer , in Trying to understand JSON…
@MostlyBlindGamer@rblind.com avatar

Thanks for the transcription!

Surely Java can tell the difference between a key with a null value and the absence of that key, no?

I mean, you can set up your deserialization to handle nulls in different ways, but a string to object dictionary would capture this, right?

bleistift2 OP ,

Sure, Java can tell the difference. But that doesn’t mean that the guy writing the API cares whether or not he adds a key to the dictionary before yeeting it to the client.

agressivelyPassive ,

It can, but during serialization Java sometimes emits explicit null values for null references.

That’s usually a mistake by the API designer and/or Java dev, but happens pretty often.

MostlyBlindGamer ,
@MostlyBlindGamer@rblind.com avatar

That’s the thing though, isn’t it? The devs on either side are entering into a contract (the API) that addresses this issue, even if by omission. Whoever breaks the contract must rightfully be ejected into the stratosphere.

agressivelyPassive ,

That’s exactly not the thing, because nobody broke the contract, they simply interpret it differently in details.

Having a null reference is perfectly valid json, as long as it’s not explicitly prohibited. Null just says “nothing in here” and that’s exactly what an omission also communicates.

The difference is just whether you treat implicit and explicit non-existence differently. And neither interpretation is wrong per contract.

MostlyBlindGamer ,
@MostlyBlindGamer@rblind.com avatar

I think we’re fully in agreement here: if the API doesn’t specify how to handle null values, that omission means they’re perfectly valid and expected.

Imagine a delivery company’s van exploding if somebody attempts to ship an empty box. That would be a very poorly built van.

masterspace ,

Null means I’m telling you it’s null.

Omission means it’s not there and I’m not telling you anything about it.

There is a world of difference between those two statements. It’s the difference between telling someone you’re single or just sitting there and saying nothing.
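In JavaScript the two cases are directly observable, which is a quick way to see that they are distinct statements:

```javascript
const saidNull = { name: null }; // "I'm telling you it's null"
const saidNothing = {};          // the field was simply omitted

console.log('name' in saidNull);    // true
console.log('name' in saidNothing); // false
// Plain property reads conflate the two under loose equality:
console.log(saidNull.name == saidNothing.name); // true (null == undefined)
```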

agressivelyPassive ,

Nope.

If there’s a clear definition that there can be something, implicit and explicit omission are equivalent. And that’s exactly the case we’re talking about here.

masterspace , (edited )

Sure, in a specific scenario where you decide they’re equivalent they are, congratulations. They’re not generally.

agressivelyPassive ,

Did you read the comments above?

You can’t just ignore context and proclaim some universal truth, which just happens to be your opinion.

docAvid ,

At the (SQL) database level, if you are using null in any sane way, it means “this value exists but is unknown”. Conflating that with “this value does not exist” is very dangerous. JavaScript, the closest thing there is to a reference implementation for json serialization, drops attributes set to undefined, but preserves null. You seem to be insisting that null only means “explicit omission”, but that isn’t the case. Null means a variety of subtly different things in different contexts. It’s perfectly fine to explicitly define null and missing as equivalent in any given protocol, but assuming it is not.

agressivelyPassive ,

Again, did you actually read the comments?

Is SQL an API contract using JSON? I hardly think so.

Java does not distinguish between null and non-existence within an API contract. Neither does Python. JS is the weird one here for having two different identifiers.

Why are you so hellbent on proving something universal that doesn’t apply for the case specified above? Seriously, you’re the “well, ackshually” meme in person. You are unable or unwilling to distinguish between abstract and concrete. And that makes you pretty bad engineers.

docAvid ,

If your SQL model has nulls, and you don’t have some clear way to conserve them throughout the data chain, including to the json schema in your API contract, you have a bug. That way to preserve them doesn’t have to be keeping nulls distinct from missing values in the json schema, but it’s certainly the most straightforward way.

The world has more than three languages, and the way Java and Python do things is not universally correct. I’m not up to date on either of them, but I’m also guessing that they both have multiple libraries for (de) serialization and for API contract validation, so I am not really convinced your claims are universal even within those languages.

I am not the other person you were talking to, I’ve only made one comment on this, so not really “hellbent”, friend.

Yes, I am pretty sure I read the comments, although you’re making me wonder if I’m missing one. What specific comment, what “case specified above” are you referring to? As far as I can see, you are the one trying to say that if a distinction between null and a non-existent attribute is not specified, it should universally be assumed to be meaningless and fine to drop null values. I don’t see any context that changes that. If you can point it out, specifically, I’ll be glad to reassess.

Saizaku ,

At the (SQL) database level, if you are using null in any sane way, it means “this value exists but is unknown”.

Null at the SQL level means that the value isn’t there; idk where you’re getting that from. SQL doesn’t have anything like JS’s undefined: there’s no other way to represent a missing value in SQL other than null (you could technically decide on certain values for certain types, like an empty string, but that’s not something SQL defines).

JackbyDev ,

I (think, at least) the point they’re making is that unless the API contract specifically differentiates between “present and null” and “absent” then there is no difference. (Specifically for field values.)

masterspace ,

The point I’m making is kind of the opposite, unless the contract explicitly states that they’re the same they should not be treated as the same, because at a fundamental level they are not the same thing even if Java wants to treat them as such.

Lysergid ,

Kinda. I guess we can all agree it’s more typical to deserialize into a POJO, where there is no such thing as a missing field. Otherwise why would you choose Java if you don’t use types? This is a great precondition for various stupid hacks to achieve “patching” resources, like blank strings or negative numbers for positive-only fields, or even Optional as a field.

NigelFrobisher ,

You can always bind the JSON to a hashmap implementation, as that’s all JSON is anyway. It’s not pretty but it works.

sukhmel , in "No way to prevent this" say users of only language where this regularly happens - 07/01/2024

Also, I like how this problem had a really simple solution all along

There really isn’t anything we can do to prevent memory safety vulnerabilities from happening if the programmer doesn’t want to write their code in a robust manner.

Yeah, totally, it’s all those faulty programmers fault. They should’ve written good programmes instead of the bad ones, but they just refuse to listen

onlinepersona OP , (edited )

Right, those devs with 20+ years C experience don’t know shit about the language and are just lazy. They don’t want to catch up with the times and write safe C. It’s me, the dude with 5 years of university experience who will set it straight. Look at my hello world program, not a single line of vulnerable code.

Anti Commercial-AI license

sukhmel ,

This is not completely wrong, though

onlinepersona OP , (edited )

Yeah, for sure. Human error is involved in C and inertia too. New coding practices and libraries aren’t used, tests aren’t written, code quality sucks (variable names in C are notoriously cryptic), there’s little documentation, many things are rewritten (seems like everybody has rewritten memory allocation at least once), one’s casual void * is another’s absolute nono, and so on.

C just makes it really easy to make mistakes.

Anti Commercial-AI license

Corbin ,

It has nothing to do with knowing the language and everything to do with what’s outside of the language. C hasn’t resembled CPUs for decades and can’t be reasonably retrofitted for safety.

asyncrosaurus ,

Well yeah, 100% of programming errors are programmers fault.

brian , in Surely "1337" is the same as 1337, right?

json doesn’t have ints, it has Numbers, which are ieee754 floats. if you want to precisely store the full range of a 64 bit int (anything larger than 2^53 -1) then string is indeed the correct type

bleistift2 ,

Let me show you what Ethan has to say about this: feddit.org/post/319546/174361

bleistift2 OP ,

json doesn’t have ints, it has Numbers, which are ieee754 floats.

No. numbers in JSON have arbitrary precision. The standard only specifies that implementations may impose restrictions on the allowed values.

This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.

www.rfc-editor.org/rfc/rfc8259.html#section-6

veganpizza69 , in Surely "1337" is the same as 1337, right?
@veganpizza69@lemmy.world avatar

It’s the API’s job to validate it either way. As it does that job, it may as well parse the string as an integer.

JordanZ ,

deleted_by_author

bleistift2 OP ,

Or even funnier: It gets parsed in octal, which does yield a valid zip code. Good luck finding that.

xthexder ,
@xthexder@l.sw0.com avatar

Well shit, my zip code starts with a 9.

bleistift2 OP ,

I’m not sure if you’re getting it, so I’ll explain just in case.

In computer science a few conventions have emerged on how numbers should be interpreted, depending on how they start:

• decimal (the usual system with digits from 0 to 9): no prefix
• binary (digits 0 and 1): prefix 0b, so 0b1001110
• octal (digits 0 through 7): prefix 0, so 0116
• hexadecimal (digits 0 through 9 and then A through F): prefix 0x, so 0x8E

If your zip code starts with 9, it won’t be interpreted as octal. You’re fine.

xthexder ,
@xthexder@l.sw0.com avatar

Well, you’re right. I wasn’t getting it, but I’ve also never seen any piece of software that would treat a single leading zero as octal. That’s just a recipe for disaster, and it should use 0o116 to be unambiguous.

(I am a software engineer, but was assuming you meant it was hardcoded to parse as octal, not some weird auto-detect.)

docAvid ,

It’s been a long time, but I’m pretty sure C treats a leading zero as octal in source code. PHP and Node definitely do. Yes, it’s a bad convention. It’s much worse if that’s being done by a runtime function that parses user input, though. I’m pretty sure I’ve seen that somewhere in the past, but no idea where. Doesn’t seem likely to be common.

bleistift2 OP ,

PHP and Node definitely do.

Node doesn’t.

> parseInt('077')
77

1. If the input string, with leading whitespace and possible +/- signs removed, begins with 0x or 0X (a zero, followed by lowercase or uppercase X), radix is assumed to be 16 and the rest of the string is parsed as a hexadecimal number.
2. If the input string begins with any other value, the radix is 10 (decimal).

developer.mozilla.org/en-US/docs/Web/…/parseInt

docAvid ,

You seem to have missed the important phrase “in source code”, as well as the entire second part of my comment discussing that runtime functions that parse user input are different.

bleistift2 OP ,

You seem to have missed the important phrase “in source code”

I read that, but I thought it was a useless qualifier, because everything is source code. You probably meant “in a literal”.

bleistift2 OP ,

I’ve also never seen any piece of software that would treat a single leading zero as octal

I thought JavaScript did that, but it turns out it doesn’t. I thought Java did that, but it turns out it doesn’t. Python did it until version 2.7: docs.python.org/2.7/library/functions.html#int. C still does it: en.cppreference.com/w/c/string/byte/strtol

xthexder ,
@xthexder@l.sw0.com avatar

Interesting that strtol in C does that. I’ve always explicitly passed in base 10 or 16, but I didn’t know it would auto-detect if you passed 0. TIL.

kamen ,

Oof.

I guess this is one of the reasons that some linters now scream if you don’t provide a base when parsing numbers. But then again, good luck finding it if it happens internally. Still, I feel like a ZIP code should be treated as a string even if it looks like a number.

bitfucker ,

Yep. Much like we don’t treat phone numbers as numbers. The rule of thumb is that if you don’t do any arithmetic with it, it is not a “number” but numeric.

sukhmel , (edited )

Well, we don’t, but every spreadsheet program out in the wild, on the other hand… /j

Yes, I know that you can force it to become text by prepending an apostrophe to the phone number, choosing an appropriate format for the cells, etc. The point is that this often requires meddling after the phone number gets displayed as something like 3e10.

raman_klogius ,

Who tf decided that a 0 prefix means base 8 in the first place? If a time machine was invented somehow, I’m going to cap that man, right after the guy that created JavaScript.

JackbyDev ,

Should be like 0o777 to mimic hex 0xFF

bleistift2 OP ,

I refuse to validate data that comes from the backend I specifically develop against.

Thcdenton ,
mariusafa , in Please stop

What’s wrong with having some years-old software? Does it do what you need? Yes. Then what? I have all I need on Debian. Why should I care about new updates? Security? Yes, we have Debian security for that. Look, y’all had the xz backdoor package in your systems because it was new. As a Debian stable user, I didn’t have to deal with it. Did I lose something by not having the latest software? No. Well, maybe fewer crashes.

Most proprietary software also gets weekly updates. Does that make it better? No. You may prefer that.

Also, I don’t get the point about the version numbering of Debian packages. Every team uses the versioning they want.

From my experience, software that updates a lot tends to break old features a lot too.

Debian supporting free-software projects or other stuff doesn’t look like a relevant argument. I mean, if you prefer using proprietary stuff, do whatever you like with your Google/Facebook/Apple friends.

But don’t come poisoning the community with this bullshit.

bruh ,

Does it do what you need?

No

phoenixz ,

Cool, get something else, then.

yum13241 ,

Because then people file bug reports for ages-old software. Ancient does not equal stable!

excel , (edited ) in Trying to understand JSON…
@excel@lemmy.megumin.org avatar

If you’re branching logic on the existence or non-existence of a field rather than the value of a field (or treating undefined differently from null), I’m going to say you’re the one doing something wrong, not the Java dev.

These two things SHOULD be treated the same by anybody in most cases, with the possible exception of rejecting the latter due to schema mismatch (i.e. when a “name” field should never be defined, regardless of the value).

sik0fewl ,

Ya, having null semantics is one thing, but having different null and absent/undefined semantics just seems like a bad idea.

Username ,

Not really: if absent means “no change”, present means “update” and null means “delete”, the three values are perfectly well defined.

For what it’s worth, Amazon and Microsoft do it like this in their IoT offerings.

0x0 ,

It gets more fun if we’re talking SQL data via a C API: is that 0 a field with value 0 or an actual NULL? Oracle’s Pro*C actually has an entirely separate structure of indicator variables just to flag actual NULLs.

expr ,

Zalando explicitly forbids it in their RESTful API Guidelines, and I would say their argument is a very good one.

Basically, if you want to provide more fine-grained semantics, use dedicated types for that purpose, rather than hoping every API consumer is going to faithfully adhere to the subtle distinctions you’ve created.

masterspace , (edited )

They’re not subtle distinctions.

There’s a huge difference between checking whether a field is present and checking whether its value is null.

If you use lazy loading, doing the wrong thing can trigger a whole network request and ruin performance.

Similarly, when making a partial change to an object, it is often flat out infeasible to return the whole object if you were never provided it in the first place, which will generally happen if you have a performance-focused API, since you don’t want to be wasting huge amounts of bandwidth on unneeded data.

expr ,

The semantics of the API contract are distinct from its implementation details (lazy loading).

Treating null and undefined as distinct is never a requirement for general-purpose API design. That is, there is always an alternative design that doesn’t rely on that misfeature.

As for patches, while it might be true that JSON Merge Patch assigns different semantics to null and undefined values, JSON Merge Patch is a worse version of JSON Patch, which doesn’t have that problem, because, like I originally described, the semantics are explicit in the data structure itself. This is a transformation that you can always apply.

masterspace , (edited )

No, there isn’t.

Tell me how you partially change an object.

Object User:

{ Name: whatever, age: 0 }

Tell me how you change the name without knowing the age. You fundamentally cannot, meaning that you either have to shuttle useless information back and forth constantly so that you can always patch the whole object, or you have to create a useless and unscalable number of endpoints, one for every possible field change.

As others have roundly pointed out, it is asinine to generally assume that undefined and null are the same thing, and no, it flat out is not possible to design around that, because at a fundamental level those are different statements.

expr ,

As I already said, it’s very simple with JSON Patch:

[
  { "op": "replace", "path": "/Name", "value": "otherName" }
]

Good practice in API design is to permissively accept either undefined or null to represent optionality with the same semantics (except when using JSON Merge Patch, but JSON Patch, linked above, should be preferred anyway).

masterspace ,

I.e. waste a ton of bandwidth sending a ridiculous amount of useless data in every request, all because your backend engineers don’t know how to program for shit.

Gotcha.

expr ,

It’s about making APIs more flexible, permissive, and harder to misuse by clients. It’s a user-centric approach to API design. It’s not done to make it easier on the backend. If anything, it can take extra effort by backend developers.

But you’d clearly prefer vitriol to civil discourse and have no interest in actually learning anything, so I think my time would be better spent elsewhere.

sukhmel ,

Except, if you use any library for deserialization of JSONs, there is a chance that it will not distinguish between null and absent, and that will be absolutely standard-compliant. This is also an issue with protobuf, which inserts default values for plain types and enums. Those standards are just not a great fit for patching.

masterspace ,

I’ve never once seen a JSON serializer misjudge null and absent fields, I’ve just seen developers do that.

sukhmel ,

Well, Jackson before 2.9 did not differentiate, and although this was more than five years ago now, this is somewhat of a counter-example.

Also, you sound like serializers are not made by developers.

masterspace ,

Bruh, there’s a difference between the one or two serializing packages used in each language, and the thousands and thousands and thousands of developers who miscode contracts after that point.

eyeon ,

It does feel ambiguous though, as even what you outlined misses a 4th case: if null means delete, how do I update it to set the field to null?

    paholg ,

    They’re semantically different for PATCH requests. The first does nothing, the second should unset the name field.

    expr ,

    Only if using JSON merge patch, and that’s the only time it’s acceptable. But JSON patch should be preferred over JSON merge patch anyway.

    Servers should accept both null and undefined for normal request bodies, and clients should treat both as the same in responses. API designers should not give each bespoke semantics.

    masterspace ,

    Why?

    Because Java struggles with basic things?

    It’s absurd to send that much data on every patch request, to express no more information, but just to appease the shittiness of Java.

    Aux ,

    Why are you so ignorant?

    XpeeN ,

    Why not explaining instead of looking down on people? Now they know they’re wrong bit don’t know why. Nice.

    Aux ,

    You’ve replied to the wrong person.

    arendjr ,

    JSON patch is a dangerous thing to use over a network. It will allow you to change things inside array indices without knowing whether the same thing is still at that index by the time the server processes your request. That’s a recipe for race conditions.

    expr ,

    That’s what the If-Match header is for. It prevents this problem.

    That being said, I generally think PUTs are preferable to PATCHes for simplicity.
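
    The If-Match flow can be sketched server-side (hypothetical names, stdlib only): the server derives an ETag from the current representation and refuses a patch whose If-Match no longer matches, returning 412 Precondition Failed instead of applying an index-based change to a resource that moved underneath the client.

    ```python
    import hashlib
    import json

    def etag_of(resource):
        """Derive an ETag from the canonical JSON of the resource."""
        canonical = json.dumps(resource, sort_keys=True).encode()
        return hashlib.sha256(canonical).hexdigest()[:16]

    def handle_patch(resource, if_match, apply_fn):
        """Apply a patch only if the client saw the current version."""
        if if_match != etag_of(resource):
            return 412, resource  # Precondition Failed: client's view is stale
        return 200, apply_fn(resource)

    doc = {"items": ["a", "b", "c"]}
    tag = etag_of(doc)
    # The client built its index-based patch against `tag`; if the array changed
    # in the meantime, the ETag differs and the patch is refused, not misapplied.
    status, _ = handle_patch(doc, tag, lambda d: {**d, "items": d["items"][1:]})
    ```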

    souperk , in Looks good to me 👍
    @souperk@reddthat.com avatar

    I am definitely guilty of that, but I find this approach really productive. We use small bug fixes as an opportunity to improve the code quality. Bigger PRs often introduce new features and take a lot of time; you know the other person is tired and needs to move on, so we focus on the bigger picture, requesting changes only if there is a bug or an important structural issue.

    NocturnalMorning ,

    I always try to review the code anyway. There’s no guarantee that what they wrote is doing what you want it to do. Sometimes I find the person was told to do something and didn’t realize it actually needs to do Y and not just X, or vice versa.

    ScampiLover ,

    I like to shoot for the middle ground: skim for key functions and check those, run code locally to see if it does roughly what I think it should do and if it does merge it into dev and see what breaks.

    Small PRs get nitpicked to death since they’re almost certainly around more important code

    derpgon ,

    Especially when you see a change in code, but not in tests ☠️

    souperk ,
    @souperk@reddthat.com avatar

    Yes, I always review the code, just avoid nitpicking the hell out of it.

    NocturnalMorning ,

    Yeah, sorry, totally misread your comment.

    breakingcups ,

    So you’re always behind, patching up small bits of code that don’t comply with your guidelines, while letting big changes with, by deduction, worse code quality through?

    souperk ,
    @souperk@reddthat.com avatar

    Not really, we are a small team and we generally trust each other. Sure there are things that could have been better, but it’s not bad either.

    gnutrino , in Looks good to me 👍

    This is why I always rename all the variables in the project on each PR.

    jol ,

    I know this is a joke, but if you did that I would reject the PR for doing too many things at once. Open a separate PR to refactor variable names. I actually constantly get people doing this, and it’s dangerous exactly for the reason you’re joking about: it makes it easier for errors to slip in.

    Lifter ,

    This will lead to change fatigue. People will rather not cleanup as they go anymore and just get the work done, with worse and worse code quality as a result.

    jol ,

    I prefer that to sneaking defects in via huge PRs.

    silasmariner ,

    I know you’re playing the straight man to a joke, but actually you can apply a linter, then tell GitHub to ignore that re-linting PR for the purposes of blame. All such PRs are massive, and yet by virtue of the linter’s replayability it’s also very easy to ensure errors didn’t slip in when reviewing.

    I know the original comment was about renaming all the variables, but that’s obviously deliberately absurd, so I’m using here a completely realistic example instead.

    refalo , in "No way to prevent this" say users of only language where this regularly happens - 07/01/2024

    lol this same post got flagged and taken down from HN

    verstra ,

    Well, lemmy is a place for a much more cultured audience. We can appreciate a good shitpost (that does also hold some water).

    brrt , in Looks good to me 👍

    Just give them 10 lines at a time from the 500-line one. Is this how micromanagement was born?

    ID411 ,

    It’s how elephants are eaten

    LeFantome ,

    If you do that, you will never get through the toenails. Been there.

    TheSlad , in Looks good to me 👍

    In my first programming job, I would actually do code reviews by pausing my own work, pulling their branch and building it locally, then using debug mode to step through every changed or added line of code looking for bugs, unaccounted-for edge cases, and code quality issues.

    …I don’t do that anymore. I now go “looks good to me” even on 10-line reviews.

    silasmariner ,

    Yeah but I bet you do it sometimes on your own pull requests even after you’ve opened them don’t you?

    Wilzax , in Surely "1337" is the same as 1337, right?

    Protocol Buffers are hated, but they are needed.

    JackbyDev ,

    Do you actually use them?

    Wilzax ,

    I’m a student so, yes and no?

    sukhmel , (edited )

    I do, but I also don’t think that’s a silver bullet, unfortunately. There’s convenience in code generation and compatibility, at least
