Apple is looking for engineers to convert its code from C to Rust (jobs.apple.com)
381 points by itvision | 2020-03-20 17:11:52 | 240 comments




Seems like a tough role to fill.

For Apple or for employees you mean?

I would imagine that many people are going to apply.

I would have loved to write Rust at Apple, if I ever got the chance to. And Canada is on my list of countries I would like to live in already anyway.


Not everyone who applies is qualified.

Why? I don't have any Rust experience, but the rest seems reasonable to me and I figure I have enough C experience to make up for it.

If I happened to live in the area and was looking for a job I would definitely consider applying.


Because C programmers are 1000x more common than Rust programmers.

Edit: "only" 50x according to TIOBE.


It said Rust and/or C though, and I imagine learning Rust is pretty easy compared to C from what I've seen.

People who are very good at C or C++ will be able to learn Rust pretty quickly, yes. People who LIKE C probably won't like Rust, though. It's more like C++.

For what it's worth, I used to be big on C, and grew to strongly dislike C++ (because it's simply insane). Rust has been my language of choice for years now.

IMO, Rust shares the same inner simplicity that C has, combined with the ability to build and use bulletproof higher-level abstractions. The same way I roughly knew what machine instructions my C code produced, I know what machine instructions my Rust code produces.


Well yes, because LLVM generates the machine instructions

Rust definitely does not share the simplicity of C.

It is the equivalent of C++, not C. You can write code for which you cannot guess the machine code as soon as you use any of the abstractions, just like in C++.


Rust’s standard library is about on par with C’s, though.

Rust standard library is dramatically more useful than C’s. C doesn’t offer any kind of collections in its libraries. Every C project just grows its own differently broken hash maps and linked lists.
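
For what it's worth, here's a minimal, purely illustrative sketch (invented example, not from any particular project) of what the std HashMap gives you out of the box - roughly the thing every C project ends up hand-rolling:

    use std::collections::HashMap;

    fn main() {
        // Count words with the standard HashMap instead of a
        // hand-rolled C hash table.
        let text = "the quick brown fox jumps over the lazy dog the fox";
        let mut counts: HashMap<&str, u32> = HashMap::new();
        for word in text.split_whitespace() {
            *counts.entry(word).or_insert(0) += 1;
        }
        assert_eq!(counts["the"], 3);
        assert_eq!(counts["fox"], 2);
    }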

Compared to C++, Rust's libraries are severely anemic, especially in the areas where C++ is strong, such as collections, algorithms, and metaprogramming constructs such as type traits. Rust's standard library is better than C's in certain areas–you identified one, which is dictionaries. (Linked lists are hard to provide a useful, performant API for; it's often better to roll your own–especially in C–than rely on a library to do it for you.) But even without including C's POSIX libraries, Rust seems to be missing things like random number generation. In general, yes, I would probably say that I would prefer Rust's standard library over C's, but it's not huge in any sense and much closer to C's than C++'s. In any case, I came into Rust thinking it would be like C++, and ergonomically it certainly is. But its standard library is nothing like C++'s, and this caused me some disconcertion until I began treating it like C's, and being happy whenever it provided anything beyond that rather than disappointed whenever it was missing something I found useful.

Arguably not. Rust has a lot of things that C doesn't, like classes, lifetimes, and the borrow checker, which can be jarring to deal with for programmers used to other languages.

And, unlike most other languages I've seen, you can't just sit down and write a "bad" Rust program and then refine your abilities; the compiler won't let you do the "bad" things, so you have to get everything right from the word go.
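
For illustration, a tiny, deliberately non-compiling sketch (made-up names) of the kind of thing it refuses:

    fn main() {
        let mut names = vec![String::from("a"), String::from("b")];
        let first = &names[0];          // shared borrow of the vector
        names.push(String::from("c"));  // rejected: cannot borrow `names`
                                        // as mutable while `first` lives
        println!("{}", first);
    }

The push is rejected because `first` still borrows the vector; drop the borrow first and it compiles.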


Rust does not have classes.

In the wild, the meaning of class ranges from whatever it means in the bowels of theoretical OOP, to a struct that has thing.method() invocation syntax. I'd wager about 99% of people don't get more academic about what classes are beyond the latter definition since it's the only way in their language to create a group of state and then associate some methods with it, which is exactly what Rust has.
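
A minimal illustrative sketch (names invented) of that "state plus methods" shape in Rust:

    // A plain struct plus an impl block: state grouped with methods,
    // which is what most people loosely mean by "class".
    struct Counter {
        count: u64,
    }

    impl Counter {
        fn new() -> Self {
            Counter { count: 0 }
        }

        fn increment(&mut self) {
            self.count += 1;
        }

        fn value(&self) -> u64 {
            self.count
        }
    }

    fn main() {
        let mut c = Counter::new();
        c.increment();
        c.increment();
        assert_eq!(c.value(), 2); // thing.method() invocation syntax
    }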

I think in the general case, you are absolutely correct.

In a sub-thread comparing language features of two languages to each other to compare their overall complexity, I think it pays to be precise.


From what I understand Rust makes many things explicit that are implicit in other languages. You always have to deal with lifetimes one way or the other.

Which is why I think it would be easier to learn, you get immediate feedback.


> Rust has a lot of things that C doesn't, like classes, lifetimes, and the borrow checker, which can be jarring to deal with for programmers used to other languages.

If you are a competent C programmer - the kind that "doesn't make" memory errors - you have to be manually keeping track of lifetimes and borrows.

Rust takes that mental effort and offloads it to the compiler.

Lifetimes and the borrow checker will likely be jarring for people coming from languages where you don't need to worry about memory allocation and ownership, but if you are coming from C or C++, you will likely find the borrow checker a relief (I did).

> the compiler won't let you do the "bad" things, so you have to get everything right from the word go.

And it's wonderful! Finding and fixing problems at compile time is the fastest and cheapest way to do it, and is also a productive way to refine your abilities.


I've just been recently learning some Rust coming from a heavy C background. The dynamic trait objects mimic what I do in C with vtable pointers and glue functions, and then sometimes I look at it over and over to make sure the casts are right and I haven't forgotten a free or anything. The borrow checker and lifetimes mimic what I do in C with pointer constness and function documentation, and then often I get very nervous and look something up again and maybe add extra intermediary comments on the data flow within a function if I'm not absolutely sure I remembered what the lifetime behavior contract was for a caller or particular callee, and of course I have to do this with all the structs too. The Drop trait mimics what I do in C with either straight-line fallthrough-chained exit labels as interior-exit goto targets or carefully included cleanups before a lesser amount of interior returns, and then I sometimes have to convert between the two for readability, and in the former case I have to carefully use a temporary variable if I also needed to return a non-constant at the end of all that. Compound-value ownership movement mimics what I do in C with either structure assignment or memcpy, and if the latter, hope that I got the pointer types and type size right because memcpy won't check, and in either case hope that I was able to prove that the original structure wasn't going to be incorrectly reused and maybe memset it to zero just in case that makes something crash rather than cause memory corruption.
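
To make the trait-object and Drop halves of that concrete, here's a rough, invented sketch (nobody's real code, just the shape):

    // Trait objects: the vtable-of-function-pointers pattern, except the
    // compiler builds and checks the table. Drop: the cleanup you would
    // otherwise route through goto-to-exit-label chains in C.
    trait Codec {
        fn encode(&self, input: &[u8]) -> Vec<u8>;
    }

    struct Identity;

    impl Codec for Identity {
        fn encode(&self, input: &[u8]) -> Vec<u8> {
            input.to_vec()
        }
    }

    struct Buffer {
        data: Vec<u8>,
    }

    impl Drop for Buffer {
        fn drop(&mut self) {
            // Runs on every exit path, early returns included.
            println!("releasing {} bytes", self.data.len());
        }
    }

    fn main() {
        // Dynamic dispatch through a fat pointer (data pointer + vtable).
        let codec: Box<dyn Codec> = Box::new(Identity);
        let buf = Buffer { data: codec.encode(b"hello") };
        println!("{} bytes encoded", buf.data.len());
    } // `buf` dropped here; nothing to free by hand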

I expect I'll run into more friction once I get to any point where I want to make more dynamically interlinked data structures, though that might also push my designs into directions that use fewer pointers and more indexes or other indirect references to start with (I'm not sure of that yet)—and I really like the idea that if I can pare some of it down to a working “core” data structure semantic that involves pointer juggling, I can put carefully constructed hard-packed blocks of the pointer juggling in their own unsafe zone and then not be able to accidentally break the invariants outside of that.
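
A toy sketch of that index-based shape (purely hypothetical, invented names):

    // A tiny graph stored as indices into a Vec instead of raw pointers:
    // neighbours are plain usize handles, so there's no pointer juggling
    // and no borrow-checker fights over interlinked nodes.
    struct Node {
        value: u32,
        neighbors: Vec<usize>, // indices into Graph::nodes
    }

    struct Graph {
        nodes: Vec<Node>,
    }

    impl Graph {
        fn add_node(&mut self, value: u32) -> usize {
            self.nodes.push(Node { value, neighbors: Vec::new() });
            self.nodes.len() - 1
        }

        fn add_edge(&mut self, from: usize, to: usize) {
            self.nodes[from].neighbors.push(to);
        }
    }

    fn main() {
        let mut g = Graph { nodes: Vec::new() };
        let a = g.add_node(1);
        let b = g.add_node(2);
        g.add_edge(a, b);
        for &n in &g.nodes[a].neighbors {
            println!("{} -> {}", g.nodes[a].value, g.nodes[n].value);
        }
    }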

Which, again, is almost exactly the same thing I'd tend to do in C with inline wrapper functions and visibility stuff, and making sure I compute all the intermediary results in one phase and putting all the hard mutations in a second, unfailable (except possibly for hard crashes due to null pointers or such) phase so that if something goes wrong the data doesn't wind up crashing down to a persistent broken state from mid-flight!

Heck, I've even done the newtype pattern (single-element structs for safe reinterpretation/bounding of an underlying type) in C before!
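
For reference, the Rust side of the newtype pattern is roughly this (made-up types, zero runtime cost):

    // The newtype pattern: a one-field tuple struct that gives an existing
    // type a distinct identity the compiler will enforce.
    struct Meters(f64);
    struct Feet(f64);

    fn climb(height: Meters) -> Meters {
        Meters(height.0 + 10.0)
    }

    fn main() {
        let h = climb(Meters(100.0));
        println!("{} m", h.0);
        // climb(Feet(100.0)); // rejected: Feet is not Meters
    }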

I've described the way I write C as “as though I were hand-cranking the front end of a strict, low-level Haskell or Caml compiler in my head”. Rust is the closest I've seen thus far to moving a big chunk of that part of me into the digital part of the compiler. So I'm guessing my taste wasn't exactly uncommon.


Rust doesn't have classes. It has structures (struct) and interfaces (trait).

Rust syntax is strange but understandable given my C/C++/nim/python/elixir experience. I successfully managed to port a few test projects to it several months ago.

Didn't mind the experience at all except for a few niggles which actually are nothing to do with the language itself and more just some of the community attitudes. Some of the evangelists are... less optimal.


Wow, outside Edmonton... WTF. Do they have a data center there?

I think big companies see a lot of opportunity in locations outside of Silicon Valley. If you don't care about the weather in the Bay Area, which is quite pleasant, or the other industries around you (be it for business or for pleasure), then you can get an easier commute and a better house for the same salary, which is appealing to some people. (You can also pay people less, which seems to be a strategy used by companies without any Bay Area locations.)

As for how these offices exist; sometimes the team lead is senior enough to move the project there because they want it to be there, and other times a company is acquired that has offices there and there was no reason to move them. (I think Google got their NYC office by buying DoubleClick, for example.)


But then it's weird to specify a location. Camrose is a town of 18,000 people – why not make it explicitly remote?

Yeah, that's a really good point... Is this perhaps one of those hyper-specific job postings so that a company can skirt some sort of regulation about hiring immigrants by saying they did a search? But why Apple? And why Camrose...

The next Bend, Oregon!

Apple has a number of small offices around the world in addition to their large presence in the Bay Area and Austin.

It all comes down to world-class talent and their willingness to relocate. John Carmack famously joined Oculus without leaving Dallas, TX. I also believe that Chrome's entire V8 team is based out of Aarhus, Denmark due to Lars Bak's desire to move back home.

I think that may have been true at some point, but I believe only the Dart language team is there now. Could definitely be wrong though.

Interesting, this is for encrypting server-server traffic. My question: how did Apple end up doing this work in Camrose, Alberta?

I used to go there as a kid. It was one small town, and I don't suppose it has boomed since.

It's gone from about ten thousand during the 1990s to about twenty thousand now:

https://www12.statcan.gc.ca/english/census96/data/profiles/R...

https://www12.statcan.gc.ca/census-recensement/2011/dp-pd/hl...

http://municipalaffairs.alberta.ca/documents/msb/2016_Munici...

(It's nice to see that all North American countries are as bad at making websites; practically a tradition at this point!)


My hometown. From what I heard, an engineer wanted to move home and started a small remote office there. I think the office is moving to Edmonton.

Who da hell wants to move BACK to Edmonton ?

I'd consider moving back there for this job, I like Edmonton, I have many friends there still, and my family is there. And I like Rust. But I don't like the current provincial gov't and I'd have a hard time convincing my wife to leave her family behind here.

Edmonton has awful winters, but it has a beautiful river valley, decent arts scene, and good local culture. Summers are nice there. I hated much about it when I lived there in my teens and early 20s but enjoy going back to visit now.


Could you please stop posting unsubstantive comments to Hacker News? You've done it a lot, and we're trying for different than that here. In particular, please don't post regional flamebait to HN (or any flamebait).

https://news.ycombinator.com/newsguidelines.html


How is he managing to post these comments repeatedly? I need to make a new account here every day or two because I get flagged almost immediately.

Ah, is this your 45th account @gasthem_1? :P

There don't seem to be any other gasthem_n users.

No, on HN I usually just type something related to whatever is on my mind at the time.

No prizes for guessing what it was this time.


In this case we've banned the account because you can't have a trollish username. Those end up trolling every thread the account posts to.

In other cases it's probably because your new accounts are breaking the site guidelines. If you're sincere about wanting to use HN as intended, you're welcome to email hn@ycombinator.com and we'll look into it for you and be happy to help. But could you please not create accounts to break the site guidelines with?


D now has an Ownership/Borrowing system implemented (though it is in its early stages). It will work with existing code, including DasBetterC code. You're able to incrementally adapt a program to O/B rather than rewrite it.

Was this a response to something else in this thread that you accidentally top-posted?

It's cool, technically, but maybe a little off-topic.


I think it's just a shameless plug from the creator of D :)

Rust has shown that an O/B system is viable, and now D shows that an O/B system can work with C-like syntax.

What's your opinion on it? Good idea? Bad idea?

On an O/B system? I think it's a good idea, otherwise I would not have implemented it.

This is a 'mindshare' game. D had its chance, now Rust has its chance. Would have been cool though if D had caught on more back in the day. Re-invention seems to be impossible to avoid in the software world.

If Rust still has people talking about missing C++ for faster compile times, I don't think anything has solidified yet.

It hasn't solidified for the same reason that C++ in many ways still hasn't solidified. Software is a very young industry by many standards and we still haven't found the best way to make it. Rust is one way to do it, but there are many others besides, in the end which tool wins out is less important than that we make some progress and get out of the tar-pit that we've been stuck in for the last two decades.

All we seem to be doing is making more layers of gray goo and leaky abstractions that we then plumb together to attempt to make reliable systems. It still doesn't feel right to me.


idk man, people can dish out complicated, hard tested, reliable systems in a really short time. most aspects of most languages have been severely altered and improved...

Sure, we’ve made huge progress, and are still making huge progress, but it’s just not enough.

> idk man, people can dish out complicated, hard tested, reliable systems in a really short time

You must be living in some alternative universe. What I typically come across, even in fairly young companies, is tech debt ridden, buggy, endless cakes of layer upon layer of mostly cruft with a spiffy looking front thrown over the whole. Pretty icing on a crappy cake. As soon as you lift that top layer the illusion crumbles. I'd be happy to bet that most companies that run services out there would not survive a hostile audit of their code and systems.


For those not aware, Walter is the creator of D. I feel like this almost constitutes shitting on someone's Show HN post.

It definitely isn't, but if you feel like going for the most uncharitable reading possible be my guest. I respect Walter a lot more than you probably realize.

Hmm. Did you edit / adjust the "D had its chance" statement? I thought that ended with a period. If not, apologies. But it did come across as harsh :).

No, I did not.

D missed the train because the creator(s) didn't open source it; instead, they wanted to monetize it right away.

D has never had a price tag on it.

I am sensing a lot of hostility in this thread, and Walter Bright, the founder of D, is replying to direct accusations.

Mods?

What's going on HN?


Novelty accounts making idiot accusations is nothing new here, unfortunately.

A quick Google search shows D is released under the Boost license? Looks pretty free to me, and Wikipedia says the Boost license counts as free and open source, just not copyleft.

Considering copyleft is arguably more restrictive than not, it seems incredibly unfair, and untrue, to say it's monetized.

Whether it was previously seems irrelevant now.


> D had its chance

What is stopping people from downloading a D compiler and writing software in it right now?


Nothing. The problem is, by and large they don’t. D is a fantastic project and an important step forward, but Rust has won the mind share game.

https://insights.stackoverflow.com/survey/2019


For me, a mild addiction to learning nim/wren and porting code to them. D and Rust can wait a little longer.

People forget that there's a big world out there. We're all doing our thing.


Rust only proves that promotion buys mindshare. Since money buys promotion, and because of endless unscrupulous capital investment in two of Rust's most evangelical promoters, Mozilla and Cloudflare, Rust indeed has its chance. Make no mistake: that is all there is to this.

Fifteen years ago Apple underwent a similar endeavor to convert all of its C++ code to Objective-C or even C. Apple isn't a leader in this decision-space; they're just a victim of this meme.


That's great to know. I'll have to check it out. D is a great language and it's good to see it's getting even better.

D failed long ago. It tried to be like C and in doing so essentially became another C++ over time. Nobody is going to invest in D, and that is proven by adoption. The benefit is just too small...

Rust does the right thing. It keeps C ABI compatibility but utterly throws everything else out of the window and starts fresh. If you have the need for Rust, go for it. Otherwise stick with C++. Simple as that. D has no place in this discussion.


You do understand that you're replying to the founder of D, right? Walter put his whole life into D. Can we please show a little compassion for someone's life's work?

I don’t think they do.

The response clearly is reasonable, but no reasonable person would say that to someone's face. So I don't think Marta _moreno really knew about Walter.


> Walter put his whole life into D. Can we please show a little compassion for someone's life's work?

While I appreciate the kind words, I am not looking for compassion. D must stand entirely on its own merits. D has already succeeded quite spectacularly:

1. we have a wonderful D community which has been utterly indispensable

2. D is part of the Gnu Compiler Collection

3. D has been accepted into the HOPL conference, which is quite an honor

A couple things we are particularly proud of:

4. D has been used by companies to make a LOT of money. They've told me that D is their secret weapon. :-)

5. contributing to the D community has enabled many people to advance into well paying and high status careers

6. D innovations regularly find their way into other languages


Godspeed, Walter. Been a fan since 2010 as I also come from a mechanical engineering background while being fascinated by programming.

TDPL sits next to K&R on my shelf.


Thank you for responding. There's always room for an awesome programming language. I've never used D, but as a professional C++ programmer, I admire what you set out to do. (Assuming you are the creator....)

Please ignore the naysayers. HN is diverse in degree and area of knowledge, and some people even think Electron is the only cross-platform GUI framework.


Oh, I've always had plenty of naysayers. If I ever paid any attention to them, I'd be a total failure.

I think D didn't gain momentum because of the GC. An O/B system is interesting but makes the syntax ugly (at least in Rust's case).

Had D gone with (deterministic) ref counting like Swift and Vala, it would've been much more popular, I think. Such memory management keeps the syntax cute :) while not sacrificing the determinism.
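
For comparison, Rust also ships deterministic reference counting as an opt-in library type (Rc/Arc) alongside the borrow checker; a minimal illustrative sketch:

    use std::rc::Rc;

    fn main() {
        // Rc is deterministic, non-atomic reference counting: the value is
        // freed the instant the last owner goes away, with no GC pauses.
        let shared = Rc::new(vec![1, 2, 3]);
        let other = Rc::clone(&shared);
        println!("owners: {}", Rc::strong_count(&shared)); // 2
        drop(other);
        println!("owners: {}", Rc::strong_count(&shared)); // 1
    } // last owner dropped here; the vector is freed deterministically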


We looked hard at an automatic ref counting system. We could not find a way to make it 100% memory safe while allowing pointer access for efficiency. An O/B system makes this workable.

D doesn't change the syntax for O/B.


I really like the D community and the language itself. However, I cannot but feel a slight disappointment after Andrei had thoroughly dissed Rust's model, and had promised to have a novel approach to memory management that would also get rid of the garbage collector (a year back? Two years back?), and now we have D following Rust's model. Maybe that's why we should be careful what we say in public.

I have all the respect in the world for Walter and the Dlang community in general, and wish the very best for D!


Walter, have you seen http://aardappel.github.io/lobster/memory_management.html ? If you looked at refcounting, this may be a way to make it work at compile time.

> handle the encryption of every packet passing between servers within and across vast data centres

This sounds like it’s doing packet processing in software, which doesn’t seem scalable, especially for the traffic volume I would assume Apple handles. Anyone have a clue what kind of traffic volume and bandwidth we’re looking at here?

Granted, I might be overestimating the requirements given the industry I work in (service provider routing).


> This sounds like it’s doing packet processing in software

From the article:

> Based on a custom implementation of IPsec

Seems pretty clear.


>custom implementation of IPsec

What could possibly go wrong.


I'm sure this is just referring to various 'extension headers' in IPSec, which is how the protocol was designed.

Sure, poor implementations can lead to problems, but the same holds true for just about anything.


None of the big clouds use hardware network encryption. All of the resources in your network are at the leaves, so you might as well use their CPUs to do it. I think Google published that crypto and compression in their internal RPC stack cost them less than 1% of their global CPU resources.

> I think Google published that crypto and compression in their internal RPC stack cost them less than 1% of their global CPU resources.

Not sure if this makes your point compelling. At Google's scale, if you were able to reduce global CPU usage by 0.1%, that would probably be a massive win.


Reduce it how? By deploying expensive, unreliable, and hard-to-manage vendor crypto garbage? That is undoubtedly not a trade that a Google would make.

You reduce it by using custom routing ASICs that can handle orders of magnitude more traffic than a CPU with equal or lower power consumption. To put things into perspective: Cisco’s latest routing ASIC — the Cisco Silicon One — can handle 10.8 Tbps (in fixed mode).

I’m sure Google and others have evaluated this, but it’s just kind of surprising that they opt to do per-packet processing in software.

Disclaimer: I write router software at Cisco.


10 Tbps sounds like a big scary number, but when you think about it, that's only a hundred 100 Gbps ports. It should be no surprise to you that a company making their own switches and NICs, where every RPC is individually authenticated and possibly also encrypted with session keys specific to the peer, has no need of vendor "acceleration".

https://cloud.google.com/blog/products/networking/google-clo...

https://cloud.google.com/security/encryption-in-transit/appl...


10 Tbps is not a scary number? I am honestly still in awe of how such a small box can route an entire city’s worth of traffic. Different perspectives I suppose.

Regarding your points: I am not sure I completely follow.

Firstly, as far as I know, Google does not make its own switching or routing ASICs.

Secondly, virtually all switching and routing ASICs are highly programmable. So if you need a custom protocol, you can implement it using the vendor’s microcode (e.g., P4). In other words, you are not limited to the protocols that the vendor has implemented.

Given the above, I don’t see what kind of technical requirements Google has that would disqualify the use of routing ASICs.


We obviously have very different perspectives on this issue. I think given Google's architecture, where a host handles millions of flows and they are all individually keyed, just the key management nightmare alone would be enough to discredit the idea of offloading crypto to an ASIC in a separate box.

Does that actually accomplish the desired goal? Encrypting across data centers is one thing, but encrypting within a data center only makes sense if the threat model includes the datacenter's internal network being compromised – I suppose either physically or by hacking a router or something. I can't judge how much that threat model makes sense. But to the extent it does, a centralized router doesn't protect against it; in fact, if you're going to trust the router(s), there's no point encrypting the data at all. On the other hand, a per-server smart NIC would achieve equivalent protection, but maybe that's more expensive?

If you split secrets across NICs such that compromise of one NIC would only compromise a portion of internal datacenter traffic, then that would make sense. Again, as you noted, it depends on the threat model.

But from just the bandwidth and latency perspective, a custom ASIC makes more sense to me.


They're Google; they are totally down with a 1% trade for flexible and maintainable systems.

But getting from 0 to custom ASIC is way harder than you make it out :o) Plus you have a _custom_ stack on top of it, which is not malleable at all. Moreover, vendor lock-in is probably not such a good idea anyway.

Giving credit where it is due, Intel's role in the development of the latest PCIe 4.0 and upcoming PCIe 5.0 specs, and of course the latest CPUs, makes it less and less appealing to consider putting up with custom h/w + s/w, IMHO.

With DPDK, for example, doing 10G/core with 64B packet sizes is considered pretty standard these days. And if you don't want to roll your own, you always have companies like 6WIND to help you (and of course they share the code as well).

Facebook/Google etc. have, by now, probably at least a decade of experience running their own SDN-enabled networks. Facebook even has FBOSS to commoditize switches from white-label manufacturers...


SONiC is an interesting initiative by MS that reduces the impact of vendor lock-in by abstracting out ASIC specifics: https://azure.github.io/SONiC/

Even if you utilize every last cycle of CPU, the price/bps and energy/bit will still be way higher than almost any switching or routing ASIC on the market.


That doesn't mean you're doing crypto with integer operations. Newer CPUs support AES in hardware.

Well, of course. Practically all of the primitives in OpenSSL, BoringSSL, etc. are coded in assembly.

Seems like Rust is this year's new hot language.

Rust is designed to be the language of the century. That's why it's named Rust.

40 years at least, according to a core team member: https://www.youtube.com/watch?v=A3AdN7U24iU

This year's hot new language of the century.

I didn't mean "no this is the one that's gonna stick", I meant "code written in this is explicitly designed to last a long time".

That's not a great summary of the history of the name.

Here's a better one: https://www.reddit.com/r/rust/comments/27jvdt/internet_archa...


Doesn't Apple prevent engineers from having related engineering hobbies when they sign the contract?

Usually.

> having related engineering hobbies

Like an engineering blog and open source projects? I would be surprised if a company can restrict an employee's hobbies outside of working time.


They can if the outside work is related, which it always is if the company is large enough.

From my friends that work there - the answer is "it's complicated". Some of them do get away with it, obviously such as those that work on open-source software at Apple (like Safari, Darwin, etc) - or those with non-computer-related side-projects like photography or artisanal furniture making...

Apple seems to have inherited Microsoft's previous institutional paranoia about open-source software: the legal dept is concerned that if an employee casually browses GitHub and is inspired by some GPLv3 code that they could rewrite themselves for use in a proprietary Apple product then the lawyers consider that Apple product now possibly tainted (as rewriting code or being "inspired" by other code still legally counts as a derivative work, even if absolutely nothing was copy and pasted).

Microsoft lost that attitude around the same time Satya became CEO - I was at Microsoft when the transition happened and it was a refreshing change of policy that really came top-down. Of course we still had annual SBC training to remind us to always verify the licenses of any third-party or open-source components we use (and GPL is generally a big no-no, of course, without special dispensation from LCA) but the idea that a product's legal status could be tainted by casual browsing went by the wayside. I think a lot of the change came from a wave of sanity at the C-level when they realised the company was not actually being actively destroyed by meritorious - and meritless - open-source copyright violation lawsuits, and the few incidents that did occur (like the Windows ISO-to-USB tool debacle) were overblown with minimal impact to the company.

But Apple's paranoia isn't just about legal matters, but also out of concern that if Apple-watchers know who works for Apple and monitor their GitHub accounts then they'd be able to see which areas of technology interest those people, which may in turn drop hints about what Apple is working on (e.g. if we suddenly see a bunch of Apple folks starring and forking LiDAR libraries for private use then that hints Apple is working on a LiDAR product... which they announced this week that they are: https://www.theverge.com/2020/3/18/21185959/ipad-pro-lidar-s...)

Now, as someone who believes they'd otherwise make a great contribution to Apple as a valuable SE asset (yes, I'm being self-aggrandizing), this policy of theirs is a huge deal-killer for me. Not just because I own some popular GitHub repos with GPL licenses, but because I also have popular (and profitable) side-projects that only use a few hours of my time each month - and Apple is just as paranoid about those as about open-source projects being vectors for leaking company movements, even unintentionally.

Heh - I remember shortly after I did NEO at MSFT and filled-out the "prior inventions" paperwork for my side-projects, my team lead admonished me for wasting his time looking at my dumb Google AdWords-supported websites - though he did agree the definition of "prior invention" was too vague.

(Footnote: if you're a recruiter or hiring-manager at Apple and you're reading this, and you agree that my side activities won't be a problem, please reply to this comment - I'd love to work on WebKit or Darwin at Apple :D!)


> Some of them do get away with it, obviously such as those that work on open-source software at Apple (like Safari, Darwin, etc)

To be clear, Safari is closed source, while WebKit is worked on mostly in the open. XNU is semi-frequently released as source dumps.


Apple makes you sign the same contract that e.g. Google makes you sign, which says basically "I sign away all rights to anything I create to the maximum extent legally permissible under California law" (you can look up what those exceptions are). For companies like Apple, this basically means they own everything you might do, since their "area of business" is essentially limitless.

However, once I signed away my rights, the experience at Google and Apple was quite different. At Apple, I waited months, with multiple follow up pings, to get approval from a lawyer for a one line trivial patch for an OSS project. I had to give an argument that my contribution provided a direct business benefit to Apple, and generating goodwill in the community was explicitly listed as a reason that is not valid. I couldn't contribute to any Google run OSS projects either (some issue with the CLA, not sure of the blame, TBH).

In contrast, at Google you are encouraged to contribute, don't need any approval for normal OSS projects, and I have easily gotten approval to release two personal projects.


And at Microsoft we didn’t need any approval at all for things done in our own time and own equipment - provided it wasn’t competing with anything the company was working on.

It gets better: during the Windows 8 launch and for the life of Windows Phone 8/10 we were actively encouraged to build our own apps for their respective App Stores (cynically this was in part to boost app count numbers, but also to make us motivated to dogfood the platform, provide feedback, etc). IIRC we were expressly permitted to use company hardware too - just not during core business hours. That said, I openly worked on my never-released clone of “Lose Your Marbles” during my downtime in our teamroom office during the day - right in front of our team’s Partner-level director who didn’t bat an eyelid...


Don’t believe everything you read on the internet.

Why wouldn't they use Swift? Seems odd.

Swift’s performance isn't really up to high-performance network processing, at least not yet.

If it can't fill a network connection by now, I wouldn't hold my breath.

Swift is a LLVM front-end just like Rust.

You are saying that like it must then be as fast as Rust by this virtue.

No, semantics and idiomatic code matters but I don't think performance oriented Swift code would be significantly slower than Rust. I might be wrong though - feel free to share relevant benchmarks.


Interesting, thanks! Ref counting has an overhead, of course. I wonder about the performance of the "performance oriented Swift code" - i.e. it would be non-idiomatic, but to avoid ref-counting - something you would write in your inner loop or other similar hot points.

Yes, they're programs from the benchmarks game that use a lot of UnsafeMutablePointer, which is: > provides no automated memory management or alignment guarantees. You are responsible for handling the life cycle of any memory you work with through unsafe pointers to avoid leaks or undefined behavior. [1]

So I think they're "performance oriented Swift code". I'm not familiar with Swift, so sorry if I'm wrong. [1]: https://developer.apple.com/documentation/swift/unsafemutabl...


That doesn't magically make Swift and Rust equally performant.

Why is this downvoted? It's a valid question. Chris Lattner, the creator of Swift, has stated many times his goal of Swift being the best language for all uses.

Maybe one guy's goal isn't the same as 100% corporate buy-in.

Either you misquote him or he has no idea what he is talking about. Maybe in a walled garden like Apple's it could come pretty close in many instances, but One Language To Rule Them All is naive at best.

>Either you misquote him......

Not parent, but no, I wish that were the case. He stated it numerous times and compared it to scaling from low-level C to JavaScript. That was 3-4 years ago.

I was extremely sceptical of it because it was unrealistic; that is of course very much against the Silicon Valley optimistic view, so it wasn't a popular opinion. But in a recent interview he has definitely rephrased it and said it doesn't need to be the best at everything.

Now I have another problem: it is not the best at anything, so what is it actually good at?

That is why I still think Objective-C is going to stay for at least a few more years.


Ah yes, a creator of a language wanting his pet language to be the best language for everything. That's a first

Rust has more safety features and better runtime performance. So either or both of those reasons.

Why would they use Swift? It doesn't seem like the most fitting replacement for C level infrastructure.

I don't see any obvious arguments why. It integrates very well with existing C code, a lot can be done with structs and static dispatch, there are pointers, etc.

Not Rust level capabilities, but pretty good.


> Not Rust level capabilities

And you answered your own question.

If you aim for performance, "pretty good" doesn't cut it.


Sad but true: performance is possible in Swift, but it's not very idiomatic and usually not safe.

Rust still looks like Rust and it’s still safe when you have performant code.


> Sad but true, performance is possible in swift but it’s not very idiomatic and usually not safe.

There's nothing sad about it. You pick the right tool for the job. Swift is not designed to be a reference tool in performance-sensitive applications, thus you pick the tools that are.

There is no need to shoe-horn in the wrong tool just because it's popular or fancy. A tool is just a way for technicians to express their skills, and just because someone is skilled at developing front-ends doesn't mean he is competent at developing performant systems-level applications. Specialization matters, not only in tools.


> There's nothing sad about it.

Why? It's a drop-in replacement for Objective-C that allowed you to dip into C and C++ code in the same code file, even in one function.

Now Swift is faster for most higher level scenarios a front-end developer deals with but it's slower than the C performance Objective-C allowed.


I.e. it isn't a drop-in replacement. There is nothing wrong about a language not being good at something. When you want to have everything, you get C++. And working with C++ is just sad.

> you want to have everything, you get C++.

C++ is an excellent example of the perils of developing a tool that's good at everything, because the cognitive load to do anything with it is simply not manageable.

Picking the right tool for the job is always the solution.


> If you aim for performance, "pretty good" doesn't cut it.

It does in the overwhelming majority of real cases.


But not at the trillion dollar company that is Apple, which is what we are talking about

Even for Apple, if "pretty good" performance were the price of avoiding another "goto fail" then it should be more than worth it.

Sure, for some hypothetical tradeoff you just made up then sure ¯\_(ツ)_/¯

But the reason companies as large as apple care so much about performance is because at their scale a 10% difference can easily mean 100,000 physical servers. So they do go to insane lengths to avoid "pretty good"


> If you aim for performance, "pretty good" doesn't cut it.

"Performance" isn't an absolute, and, going by the fact C++ is apparently acceptable in some "performance" codebases, there's more to it than directly controlling every single RAM allocation and making every single use of memory take maximal advantage of cache. You can't even get that in C without deep knowledge of internal compiler details.

Rust gives no finer control than C does, overall, but it installs some guard rails to make the language less accidentally unsafe. That's proof that "performance" isn't the primary goal with this codebase; if it were, it would be re-written in assembly, and the guard rails be damned.

So Swift isn't immediately out, unless profiling deems it so.


Swift is not yet a zero-abstraction-cost language like C or Rust which might be one of their concerns.

C is far from zero-abstraction-cost. For example you can't do a generic qsort supporting user defined types with zero overhead, other than with preprocessor macros (ugly, unsafe, unmaintainable).

This sort of standard seems hard to meet for any language, if the bar is "all abstractions should be included and zero-cost." In practice, you probably write a specialized quicksort for a given type in C to work around the lack of generic quicksort. You do this in the same way you manually convert your suspendable forkable computations into a struct that has all your "local variables" and an "instruction counter" inside to work around the lack of copyable coroutines in almost every language.

Depending on your choice of compiler, a more legitimate concern with C is maybe the lack of computed goto. See here about the nonzero cost of missing this feature: https://eli.thegreenplace.net/2012/07/12/computed-goto-for-e...


So there's an intrinsic cost to abstractions: namely that by using an abstraction, you're not specializing to the optimal extent possible; your general-purpose algorithm may well be close to ideal, but various details will not be; and even in languages that support specialization, you're likely to end up in a not quite optimal solution.

But that's not what people usually mean with zero-cost, right? They mean that assuming you do use the same algorithm, then the abstracted version is just as fast and memory efficient as the specialized version.
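
A small invented Rust sketch of that claim - the generic version monomorphizes into essentially the same machine code as the hand-specialized one:

    // Generic version: one source-level abstraction.
    fn max_of<T: PartialOrd + Copy>(items: &[T]) -> Option<T> {
        let mut it = items.iter().copied();
        let first = it.next()?;
        Some(it.fold(first, |a, b| if b > a { b } else { a }))
    }

    // Hand-specialized version for comparison.
    fn max_of_i32(items: &[i32]) -> Option<i32> {
        let mut it = items.iter().copied();
        let first = it.next()?;
        Some(it.fold(first, |a, b| if b > a { b } else { a }))
    }

    fn main() {
        let xs: &[i32] = &[3, 9, 4];
        // max_of::<i32> compiles down to essentially the same code as
        // max_of_i32; the abstraction costs nothing at runtime.
        assert_eq!(max_of(xs), max_of_i32(xs));
        assert_eq!(max_of(xs), Some(9));
    }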

And that is a bar that C++ at least clearly meets, and, by the looks of it, rust is in practice much closer than C. Other languages have various features that address this need too; it's not that rare. C really is particularly poor here, but even C has macros, and although it's horrible in terms of usability, people nevertheless manage to do some pretty nifty stuff with macros.

Still; it's fair to say C is pretty bad at zero-cost abstractions, and that widely used alternatives support that a lot better (even C++ alone is a solid chunk of the market, more than C - https://redmonk.com/sogrady/2020/02/28/language-rankings-1-2...) Implementing a custom quicksort in C is a pain, enough pain that I expect most hand-tuned special case quicksorts are likely slower than general versions available for C++ simply because people aren't going to bother doing a good job; it's too much work. And that's just quicksort; the same goes for lots and lots and lots of other things too; reimplementing everything from scratch is just too much pain - in practice people are likely to not bother, and just pay the costs.

So while "zero-cost" abstractions are a slightly nebulous goal, clearly C in particular is worse at providing those than all of its low level competitors and wannabe competitors (C++, Rust, D, Nim, etc).


This is a great comment and I endorse it.

Lots of other languages make it way easier than C to write "fused quicksort". Although it doesn't pay much regard to programming language theory, Jai is another interesting language in this area, made by a guy who's mostly concerned with being able to express the fully specialized/fused version of everything without creating extra work for the programmer.


I think you've inverted the meaning of "zero abstraction cost" here. It's not that writing abstract code is well-supported and free; it's that the language basics aren't built upon a stack of expensive abstractions.

When I’ve seen “zero abstraction cost” used, it generally has meant that writing high-level abstract code is the idiomatic way to use the language, and that it comes with zero runtime overhead. (This is the main selling point of modern C++.)

Here is a widely-used example of a zero abstraction cost, type safe red black tree in C:

http://libredblack.sourceforge.net/

It supports a few modes of operation, including codegen, and preprocessor tricks. It’s clearly maintainable, given its age and continued use.

Your third complaint is that C with preprocessor macros is ugly. That’s subjective.

I think C++ is more elegant than C + macros, but the GTK people clearly disagree.


Reference counting.

I am like many others here surprised that this is in Camrose, Alberta. I didn't think big American companies would open an office here outside of the typical tech hubs (Vancouver, Toronto, Waterloo, and Montreal).

Search “apple engineer camrose alberta linkedin” on Google [1]

You will find a handful of people who either relocated during the last year or were hired there.

This includes people like Jeff Davey, Steven Bromling, Tony Gong, Derek Hausauer and Lucas Wagner who are brilliant.

[1] https://www.google.com/search?q=apple+engineer+camrose+alber...


That is nice. I think many inside Canada wouldn't mind relocating for a role like this, but I doubt someone would be willing to trade California's winter for Alberta's, regardless of the job.

> but I doubt someone would be willing to trade California's winter for Alberta's, regardless of the job.

I was just down around the LA area, and I'll gladly take -40 winters for the rest of my life than deal with that traffic and air pollution.


That's why they invented Santa Monica.

Spend a year away in the mountains or somewhere with fresh air, then come back.

You'll be shocked.


Population (2016) 18,742. Wow, Camrose is a really surprising location

In Canada you cannot call a software developer an engineer unless he/she is an actual engineer.

What's the definition for engineer? Anything that isn't software; ie electrical, mechanical, nuclear? Or does it just mean you passed the FE exam?

I believe you need to get a Professional Engineer license in the province where you practice.

Professional licensing along the lines of architects and doctors - i.e. board exams and a long, multiyear supervised internship under a senior licensed engineer. At least that's how it looked last time I checked.

Usually it means you graduated with an accredited Engineering degree from university, and you are licensed by the regulatory body in your province to practice engineering. The laws differ from one province to another regarding what constitutes an Engineer.

Need a degree from an accredited school in engineering + pass the engineering licensing requirements. In Ontario, where I live, that means passing an ethics exam + some years of work that is overseen by someone with a P.Eng. The challenge in software is usually the last part.

Would this mean an engineer in the US who moves to Canada would not be able to practice without getting work accreditation and a license?

Yes, that's correct. Although it is fairly easy to get accredited if you are already a member in good standing of the state association of engineers.

Well glad I live in California! I’m a college drop out and all self taught. Not a member of any associations. Happily doing lots of great engineering here in the states!

But that’s not REAL engineering! You are only moving numbers around.

I don't know the details for Canada specifically, but I doubt not having a degree precludes you from registration - generally having one just expedites it.

I'd encourage you not to discount professional bodies as irrelevant, or incapable of becoming more relevant; our industry could benefit from them in many ways - perhaps most frequently on-topic on HN is the ethical and whistle-blowing aspect. There is also plenty of professional development and networking, and that only improves as more people get involved from different (or rather one's own specific) areas.


Not having a degree precludes registration... in fact, the degree must be from an engineering society accredited program, to ensure you get enough hours in front of a P.Eng prof or lecturer, among other things, I think.

One Canadian university's Comp Sci dept. started offering a Software Engineering program at one point, and it ended up in a lawsuit, the outcome of which is... complicated.

Canada is, I feel, very... lot of red tape.

The traditional term "engineer" isn't even a very good description of software dev anyway.

Software dev people should just make a new word up and abandon "Engineer".


The new word is "computer programmer".

If you work in tech writing software or building hardware this has no impact on you at all. You can still do the exact same stuff, you just can't have 'Engineer' in your title. This stuff only really comes into play for say civil engineers signing off on the final design of a project or a chemical engineer doing the same for some plant.

They cannot practice, as in they cannot stamp drawings, or in other words take responsibility for things that require an engineer to take responsibility for. What it means in practice is that either they work in a sector that does not require stamping drawings, or a P.Eng. has to review their work and stamp their drawings.

The exact policy depends on the province but an American engineer that passed the FE exam can easily become an Engineer in Training (EIT) and an American engineer that has more than 4 years work experience (reporting to an engineer) can easily become a Professional Engineer (P.Eng) in Canada.

In other cases, they look at your academic and work history, and may ask you to take FE and/or PE exams and/or an ethics exam and may ask you to complete specific engineering courses. It's actually possible to obtain accreditation through work experience alone (10 years IIRC).

You can even take the FE and PE exams in Canada for this purpose (and for Canadian grads that want to work in the US). Note that if you get an engineering degree in Canada, you don't have to take the FE exam or equivalent to start as an engineer in training, because the engineering schools themselves are accredited with very similar curriculum.

Personal anecdote: I have a Canadian engineering degree but I personally had no problem using work experience abroad as part of my four years experience towards my P. Eng.


How did the first engineer get accredited then? :)

He likely founded the association that sold accreditations.

Oddly, this doesn’t apply for the railroad. [0]

[0] - https://www.canadianbusiness.com/lists-and-rankings/best-job...


It does apply to the railroad in general but there are a few job titles grandfathered in that don't require a P. Eng. such as locomotive engineer (the other common one is power engineer) because those job titles were used before the engineering regulations existed and the job descriptions don't match the modern definition of engineering in Canada.

For engineering disciplines in the railroad industry such as structural, mechanical and electrical, a P. Eng. is indeed required.


A license, which sometimes requires a degree from an accredited engineering department.

I'm not 100% sure about the rest of Canada, but in Quebec: If you pay a yearly fee to be a member of the professional order of engineers, you can call yourself an engineer.

Sure, you have to have minimum a bachelor's from an accredited Canadian university, pass an ethics exam, and have supervised experience (or prove that you obtained the equivalent) to join in the first place, but if you don't pay the yearly fee you can only call yourself a "holder of an engineering degree".


We're not in Canada, we're on the Internet.

The same is true in Germany. "Ingenieur" is a protected title. You need to graduate from an engineering program.

anybody starting to feel like it's our "destiny" to "manifest" the American distinction, worldwide?

Please stop posting unsubstantive and/or flamebaity comments to HN. You've done it a lot and we've warned you before. Eventually we ban accounts that keep doing this.

https://news.ycombinator.com/newsguidelines.html


My last job in Canada the other Engineers got uppity about that with me.

I studied a degree called "Software Engineering", that was 4 years with first class honors. It is accredited by, and I'm a member of the Australian Institute of Engineers.

The only way for me to get accredited in Canada is 2 years of relevant work experience with an accredited Engineer in my field.

There are no Software Engineers in Canada.


> There are no Software Engineers in Canada.

This is absolutely false. For example, please review the "Software Engineering Experience Review Guideline" for Saskatchewan [0]. This was approved back in 2014.

[0] https://www.apegs.ca/Portal/Sites-Management/FileDownload/Da...


>Saskatchewan

Fine, so there are 27 software engineers in Canada.


Please don't do this here.

Uh, there's an entire software engineering program at the University of Waterloo: https://uwaterloo.ca/future-students/programs/software-engin...

Just throwing some info out there since I graduated from this exact program. We may graduate with an engineering degree, but we technically cannot use the word engineer in our title (in Canada) until we get our accreditation, which requires a couple of years working in Canada under somebody else with their P.Eng license.

The vast majority of us (definitely > 90% of my class) have moved to the US after graduation, where we pretty much can use whatever role we want (eg. my official title is software engineer at my company), but from what I know of the few people left in Canada, their roles are officially "programmer" or "developer". I've actually heard stories of P.Engs messaging people in my program on LinkedIn to ask them to change their job titles to not include the word engineer. I'm not really sure what the repercussions are.


> we technically cannot use the word engineer in our title

Just for completeness: it’s OK to use the title “B. Eng.”, which means you have a Bachelor’s in engineering, without implying anything about being a professional engineer.


Is that so; it's been a while since I've taken my engineering ethics courses so I'm a little fuzzy on the details. Thanks for letting me know.

Question for you then: Can you work under someone who's a P.Eng mechanical or civil engineer, and then get your own P.Eng license that way (as a software engineer)?

IIRC there are no limitations on who you work under provided they are licensed; a P.Eng license is not tied to any specific discipline. If I complete my engineer-in-training period (4 years of professional industry experience under the supervision of one or more P.Engs) and pass the exam component, then I will also be licensed as a P.Eng. I may choose to work as a software engineer, or in any other role, but being a P.Eng means that I uphold the standards, ethics, and discipline of the engineer title.

Thanks for the clarification -- I'm in CS, and did not know that.

Not sure why this topic is getting downvoted to oblivion.

The entire Google Canada workforce just had our titles changed from 'software engineer' to 'software developer' and were asked to update linkedin profiles to reflect this, as well.


Probably because it's _weird_.

Protected titles seem to have differing meanings in many places. Doctor is a pretty universally understood one; yet we throw around "Architect" pretty willy nilly and it's a protected title in many more countries than Canada.

Yet somehow Canada gets special consideration and other countries do not.

For context: I work for a French company, and we have many Architects despite that being a protected term. But we do not have Engineers in Canada (even if they have MSc CompSci engineering degrees).


I have been told that this in fact is only really an issue in Quebec. I am not a lawyer, but, I don't believe it's an actual issue elsewhere in Canada.

In Alberta too. No surprise really. Both Quebec and Alberta (also Ontario) have LOTS [1] of political influence from traditional engineering companies.

You know, not that long ago in Alberta, you could use "Software Engineer" without a P.Eng license.

Then the engineering society (APEGA) sued someone (Raymond Merhej) who did that [2]. But Raymond won in the courts.

So APEGA appealed. But Raymond won the appeal in the courts too.

So APEGA lobbied the Alberta Government to change the laws.

And APEGA won.

[1] An example from Quebec: SNC-Lavalin affair (https://en.wikipedia.org/wiki/SNC-Lavalin_affair ).

And from Alberta: "Kenney’s United Conservatives were elected last April on a promise to focus on oil and gas and bring jobs back to Alberta by reducing the corporate income tax rate and red tape" (Alberta government files red ink budget with focus on oil and gas, https://canada.constructconnect.com/joc/news/government/2020... )

[2] https://www.itbusiness.ca/news/it-industry-wins-round-in-eng...


Yeah, I'm from Alberta and went to the U of A (but for a B.A. in philosophy [unfinished], not CS).

Back in the 90s I ended up having to come out here to Ontario to get work in software development; not just because there wasn't any work in Edmonton really but because companies there really weren't even going to look at someone without a CS or engineering degree. Every once in a while I muse about moving back there and I take a look around at job postings and I think even with my 8 years as a SWE at Google (and a 20 year career in dev generally) I might have a hard time landing a job there; lots of postings heavily emphasizing the academic angle and obviously trying to pull people straight out of the university.

It's a more conservative business culture in some ways.


Yes you can. You can refer to most engineers as engineers even if they don't have a professional engineering license in Canada.

This argument again? Every damn time...

Well, fair is fair. You don't want your local acupuncturist calling himself an M.D., right?

We detached this subthread from https://news.ycombinator.com/item?id=22643168.

As a human of color, I don't think I could live in this town, but maybe the team is really fun. Interesting to see Camrose here! I've driven by numerous times. Feel free to ask any questions about Alberta.

> As a human of color, I don't think I could live in this town

Are you saying it's a racist town, or are you yourself objecting to its current demographics?


Not OP, but I would imagine being one of the only minorities in a town like that is difficult if you just want a nonchalant life.

Thanks, that's what I meant.

I didn't mean it is a racist town. The population of the town is 18,742, and from years of living in Alberta, I don't think I'd find many people there who share my tastes or interests.

Converting code seems like an incredibly boring job.

Maybe for you... I find it pretty interesting... though you first have to learn, understand, and document what the code is doing, and then think about how to adapt it to the patterns of the new language.
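To sketch what I mean (a toy example of my own, not anything from the posting): a C-style API that signals failure with an error code and writes its result through an out-parameter usually becomes a function returning Result when you adapt it to Rust, rather than a line-for-line translation.

    // Hypothetical sketch: adapting a C "error code + out-parameter" pattern
    // into the target language's idiom (Rust's Result) instead of copying it verbatim.
    fn parse_port(s: &str) -> Result<u16, String> {
        s.trim()
            .parse::<u16>()
            .map_err(|e| format!("invalid port {:?}: {}", s, e))
    }

    fn main() {
        match parse_port("8080") {
            Ok(p) => println!("port = {}", p),
            Err(msg) => eprintln!("{}", msg),
        }
    }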

That last part is key. Nothing like seeing code in a current language written like it's mainframe code. I've seen some really poorly ported code (and written plenty). The learning curve is pretty steep but it's great work if you enjoy it.

For me, my favorite part of working in software is the learning. New tools, languages, platforms, systems, not to mention domain knowledge. In my career I've worked in everything from government, banking, and aerospace to e-learning, eCommerce, and many things in between. What other career offers such a great opportunity for continuous learning?


> Following a very successful first foray into Rust we are migrating an established codebase from C to Rust, and building new functionality primarily in Rust.

That doesn't say the job is ONLY converting code. It says that code was converted, and new code will be in Rust.


Any thoughts on why they’d be using Rust rather than Swift for this work?

The vast majority of Apple’s data centre systems run on Linux.

Swift compiles and runs on Linux.

It’s officially supported only on Ubuntu, not Linux in general, and the support is still lacking. The corporate backing is missing, and IBM walking away from server-side Swift doesn’t inspire confidence. The whole ecosystem just isn’t there (yet?).

This rewrite isn’t only about the current situation on Linux, but also about Swift’s current performance characteristics. Lack of full Linux support is just one more hindrance before you can consider Swift for this type of work at all.


If it runs on Ubuntu it runs on nearly any Linux distribution.

Being useful and performant and effective, especially for a use case as critical and sensitive as security software, requires clearing a much higher bar than just 'compiles and runs'.

If you're writing performance-sensitive and/or correctness-critical code, Rust makes more sense. Rust makes more sense than Swift for almost any use case unless you're building Apple clients.

Swift still hasn't really escaped its niche as a language for building clients for the Apple ecosystem, with trade-offs chosen for developers who are just building those clients (rather than, say, implementing a safer OpenSSL).

For example, even Rust's web framework story is more mature and benchmark competitive than Swift's. Go to https://www.techempower.com/benchmarks/ and filter for just Rust and Swift.

I say all this sadly, as someone who builds iOS apps. I would find it very weird if someone were using Swift, of all languages, for infrastructure projects.


At the present time, I agree with you: Swift is very good for user-facing software, and because of the new built-in support for automatic differentiation it looks very promising for deep learning. I have some bias here: I am working on a book on Swift AI development. Common Lisp has been my go-to language for AI since 1982, and I wanted to explore an alternative. Specific to Apple’s ecosystem, support via CoreML is very good, and the built-in NLP libraries are also good.

I wouldn’t build network software in Swift.


Why not Ada language?

My best guess: many more people have come in contact with Rust than with Ada. I have done a few hobby projects in Rust, but trying Ada hasn't even crossed my mind.

Rust has the better ecosystem. Ada has nearly nothing in terms of open source community.

I honestly thought 2019 was going to be the year we saw Rust taken up by companies, but 2020 seems even more promising given the number of people already using it in production at the last Rust Melbourne Meetup. Rocketship!

Why not incrementally convert it to modern C++?

Probably wanting memory safety.

I don't write C++, though I have written some C. I imagine it's easier to shoot yourself in the foot with C++ compared to Rust; Rust is specifically designed to be a safe language, whereas C++ had things like auto_ptr added later.

Why have a half-way solution of porting to modern C++ when you can just write it in Rust?
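For a concrete toy illustration (my own minimal sketch, not anything from Apple's code) of the kind of bug the Rust compiler simply refuses to build:

    // Minimal sketch: use-after-move is rejected at compile time, the sort of
    // lifetime bug C++ typically only surfaces at runtime (with a sanitizer, if you're lucky).
    fn main() {
        let s = String::from("hello");   // `s` owns the heap buffer
        let t = s;                       // ownership moves to `t`; `s` is no longer usable
        // println!("{}", s);            // uncommenting this fails to compile:
        //                               //   "borrow of moved value: `s`"
        println!("{}", t);               // exactly one owner frees the buffer, so no double free
    }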


auto_ptr was removed in C++17.

If the code base needs to be rewritten anyway, why not rewrite it in a memory-safe language? It can be converted to Rust incrementally, too, because in the end it's all machine code.
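Roughly how the incremental path can look (a hypothetical sketch with made-up names, not from the job listing): migrate one function at a time, export it from Rust with a C ABI, and let the remaining C code link against it as before.

    // Hypothetical sketch: a single function moved to Rust, exposed with a C ABI
    // so the untouched C code keeps calling it through an ordinary header declaration.
    #[no_mangle]
    pub extern "C" fn byte_sum(data: *const u8, len: usize) -> u32 {
        // Safety: the C caller must pass a valid, non-null pointer to `len` readable bytes.
        let bytes = unsafe { std::slice::from_raw_parts(data, len) };
        bytes.iter().fold(0u32, |acc, &b| acc.wrapping_add(u32::from(b)))
    }

On the C side it's just a normal prototype in a header, so modules can cross the boundary one at a time.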

I'd love a Rust job, but there are so few roles out there; nearly everything is Javascript/Python/Java. Is it even worth learning?

Apple is looking for engineers to convert its code from C to Rust

>nearly everything is Javascript/Python/Java.

That depends on where you are living (if you are looking for a local job, of course).

But sure, Rust is a pretty new language with a relatively high entry threshold.

Is it worth learning? Yes, if you are planning to do systems/relatively low-level stuff in the future.


IMHO it's worth learning at least to internalize the concept of ownership. Ownership in some form exists in most languages, but Rust makes it very explicit. It guides you towards clearly organizing a program's data into isolated tree-like structures, instead of a web of everything referencing everything else. Like Lisp, even if you don't use it, it may change how you think about code.
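A tiny sketch of what that looks like in practice (toy types, nothing real):

    // Toy sketch: each value has exactly one owner, so the data naturally forms a tree.
    struct Config { name: String }
    struct Server { config: Config }    // Server owns its Config outright

    fn main() {
        let server = Server { config: Config { name: String::from("edge-01") } };
        let moved = server.config;       // ownership of the Config moves out of Server...
        // println!("{}", server.config.name); // ...so this would no longer compile
        println!("{}", moved.name);      // there's always one unambiguous owner
    }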

I see companies adopting Rust internally. Instead of hiring "Rust developers" they just add Rust to the stack and let their devs learn it. For example, Cloudflare writes most new code in Rust, but Rust is barely mentioned in the job postings.


> Cloudflare writes most new code in Rust,

I think you might be over-stating this. We do write a bunch of Rust though!


I mean newly started projects. Of course there's a ton of existing code to maintain, but — at least in my team — Rust is the default for new services.

I am glad to hear it. My team doesn’t work that way :) (we actually just did a spike in Rust and are going to end up doing the final version in Go, for a few reasons.)
