I'm not an iOS developer, but for the native modules I've developed for Cordova and React Native, Swift is definitely the much better and cleaner language to write them in.
I knew there was some churn in the Swift language but didn't think it would continue. I'm not using Swift now and will likely hold off a bit longer as a result. I did think they nailed the open source launch. Swift on the server is an exciting prospect; more competition is always good. While I'm not a Swift dev yet (I don't have a Mac, so I'm mainly interested in the libre cross-platform launch), I've been keeping my eye on it with Swift Weekly.
I think 2016-2017 could well be the year(s) of Swift's ascension.
Have been using Swift since late 2014 (having used ObjC for many years prior to that).
Even with the occasional churn (e.g. the move to 2.0 or 2.1), the sheer gain in productivity due to brevity, type checking, cleaner closures, and various other language features easily outweighs any costs. I find myself able to do more with less code, and cleaner code. It's cognitively way easier to build bigger abstractions.
We do have a few legacy bits that there's no business reason to port at this point, but Swift-to-legacy interop works well, too.
I would say the biggest step that made things usable was the introduction of incremental builds in 1.2.
I think that approach (waiting to adopt) makes sense if you aren't already an ObjC developer. Less to unlearn. However, for an ObjC dev, waiting to adopt is only going to put you further behind the curve: the hard part won't be absorbing the changes from Swift 2 to Swift 3, it's dropping your ObjC habits and learning what it means to write idiomatic Swift, which isn't going to change drastically between revisions.
At some point soon if you want to find recent examples for iOS 8 and 9, they'll all be in Swift. iOS 10 will be announced in 4 months. Books, blogs, and perhaps all of Apple's examples, will be in Swift.
Sadly though, a lot of the books either are or will be out of date as more updates come to Swift and break existing code.
I've been working on and off with Swift since the beginning, it has a lot of great characteristics, but updates breaking code isn't something most developers have time to worry about.
It's tough. I had a book out in 2014 and it's just been re-released as a second edition in 2016 (Swift Essentials) with fixes for all the changes introduced since the initial release. And you can't run the fix-its on a book :-)
One thing that has helped with the new openness of Swift's evolution is that I included several upcoming warnings, such as the demise of the ++ and -- operators and the 'standard' for loop.
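For reference, the removed idioms and their replacements look roughly like this (a sketch against Swift 3; the exact fix-its vary by Xcode version):

```swift
// Pre-Swift 3 (removed): the C-style for loop and the ++/-- operators
// for var i = 0; i < 5; i++ { total += i }

// Swift 3 replacement: range-based iteration
var total = 0
for i in 0..<5 {
    total += i
}
// total == 10

// And += where a manual counter is still needed
var count = 0
count += 1  // instead of count++
```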
There will have to be a phase change in the future when Swift moves from a "move fast and break things" to a "stable evolution". It has been suggested that this will begin with Swift 3 (when a stable ABI is introduced) but I suspect it will be Swift 4 or 5 when stability is really nailed down.
It's funny: yesterday I had a meeting with my two head course professors about this exact issue. A capstone project essentially requires an Objective-C iOS app, but publishers have stopped producing Objective-C reference books for iOS 8 and 9.
I've noticed that, and find it annoying that all of the major third party iOS learning resources (Big Nerd Ranch, Ray Wenderlich, objc.io, etc.) have switched over to Swift without retaining Obj-C support, even though it's far from deprecated on iOS 9, and Swift is far from being in a stable configuration. I have to applaud NSHipster for at least having code snippets in both, even though their books are now Swift-only.
Pretty happy the Firefox for iOS team made the decision to go fully Swift about 14 months ago :-) Almost 60,000 lines later, no regrets. It is awesome.
"The standard makefiles for SQLite have a target for building an object we call the "amalgamation". The amalgamation is a single C code file, named "sqlite3.c", that contains all C code for the core SQLite library and the FTS3, FTS5, RTREE, DBSTAT, JSON1, and RBU extensions. This file contains about 184K lines of code (113K if you omit blank lines and comments) and is over 6.4 megabytes in size".
Rough estimate is the Outlook app has over 70k lines of Swift, which is just over 50% of our codebase (the rest is Objective-C). That doesn't include external libraries.
One of our devs started adding Swift code to the project about a year and a half ago. After that proved successful, the rest of the dev team started getting up to speed. For the last 8 months, all development has been done in Swift.
We don't arbitrarily rewrite existing source, but if we're making changes to an existing Objective-C file, we evaluate if it would be worthwhile to rewrite in Swift. Often the answer is yes.
From our perspective, Swift is absolutely ready for primetime. For a large codebase like ours, it takes some time before you see direct benefits, but if you keep with it, I'm sure you won't regret it.
I should mention the downsides: the migration to Swift 2 took time, but it wasn't awful. I'm sure we'll have similar issues moving to Swift 3. Also our compilation times are quite a bit slower than with purely Objective-C.
It's totally different. Google officially stopped development and support for ADT in Eclipse. But Apple is still supporting Objective-C, and most of Apple's apps are written in Objective-C.
Apple is supporting Objective-C out of necessity (both for themselves and for others). It's still very clear where the future of iOS development lies, and how few disadvantages Swift carries.
That they continue to support Objective-C has little to do with whether or not you should be using Swift. You should, clearly.
It will be time for Swift when they stop making backwards-incompatible changes to the syntax and I can be sure my code from today will still compile tomorrow when I download the latest version of Xcode.
The alternative is to either stifle improvements or to bloat the language. And we all know what that looks like (*looks at C++*).
I personally am happy they're making backwards-incompatible changes, and I hope they continue doing so. I'm happy to maintain my software in return for working with an elegant, cutting-edge, small, logical language.
Thanks for bringing Rust into the discussion: they handle language standardization and evolution very effectively, in my opinion. "Experimental" features are added to the language, but they are all effectively feature-gated for at least one release before being concretized (it's done more pragmatically than a literal #define, of course, given that Rust has no preprocessor).
This makes sense in my opinion because it gives an opportunity to evaluate proposed language changes under real-world use cases before making them official, and people must explicitly opt into the possibility of their code not working with future releases, fully knowing and accepting the risks.
Swift is a neat language but unfortunately doesn't fix the extremely verbose and archaic-feeling NeXTSTEP libraries that make Objective-C a pain to read or write:
let bar = foo.stringByReplacingOccurrencesOfString(" ", withString: "", options: .LiteralSearch, range: nil)

versus the terser style of other languages:

let bar = foo.replace(" ", "")
I know exactly what the first one does, without knowledge of the underlying API. I have no idea what the second one does. Does it replace occurrences of the first parameter with the second one, or vice versa? I assume it returns a copy of the string? What do I need to do to perform a case insensitive search? etc.
That's a true statement. Yet I still think the second way is superior.
There's a steeper learning curve sure. But brevity has value.
Is grep a stupid ass name that makes no sense to the layman? Of course! Can even moderately complicated regular expressions actually be read by anyone after the fact? Hell no. And yet it's a fantastic tool.
If there were a dynamic toggle between verbose and brief newbies would start with verbose. I bet most users would eventually switch to brief. Not all, but most.
Do you also pronounce every abbreviation? Do you say "International Business Machines" instead of "IBM"? How about "Répondez s'il vous plaît" vs "RSVP". "As soon as possible" or "ASAP". Ever use a contraction? How about a keyboard shortcut?
I certainly agree it's potentially more readable at first. I'm not yet convinced, without actual data, that anyone, given a month or so of experience with a given common API, wouldn't be more productive with shorter names, up to a point. It would be good to get that data. Maybe I'd be better off with
a = Math.numberFromAdditionOf2Numbers(c, d)
e = Math.numberFromDivisionOf2Numbers(numerator: b, denominator: c);
vs
a = c + d;
e = b / c;
Which is best?
s = str.stringByReplacingOccurrencesOfString(" ", withString: "", options: .LiteralSearch, range: nil)
s = str.replace(" ", "");
s = str.r(" ", "");
This is a bit more verbose if you need to chain a lot of replacements... but chances are you shouldn't be doing that in the first place! Instead there should be a version that takes a dictionary, like
str.replacing(["<": "&lt;", "&": "&amp;"])
This is better because most of the time you don't actually want characters produced by earlier replacements to match later ones, and it can be faster for large strings.
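A minimal sketch of such a dictionary-taking API (the `replacing(_:)` extension below is hypothetical, not part of Foundation): each input character is looked up in the table exactly once, so output from one rule can never be re-matched by a later one.

```swift
extension String {
    // Hypothetical single-pass replacement over a character table.
    // Earlier replacements' output is never re-scanned by later rules.
    func replacing(_ table: [Character: String]) -> String {
        var result = ""
        result.reserveCapacity(count)
        for ch in self {
            result += table[ch] ?? String(ch)
        }
        return result
    }
}

let escaped = "a < b && c".replacing(["<": "&lt;", "&": "&amp;"])
// escaped == "a &lt; b &amp;&amp; c" -- the '&' in '&lt;' is not re-escaped
```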
String.replace is now a builtin standard in most languages, there is no real need to explain it anymore (apart from documentation). It would be like replacing 'grep' with 'fetchAPieceOfStringInFiles'
I started to think you had a valid point; then I went back and read "stringByReplacingOccurrencesOfString" again.
Can you really defend that as an identifier in any sane API? It's utterly abhorrent to my eyes, in the same way typical Microsoft bastardized Hungarian looks. Actually - it's worse.
Agreed, and also: the insanely long version is just str-<Complete> and a couple of arrow taps, and it auto-completes.
So it's just as short, from a typing perspective. With a lot of free bonus clarity.
I lived through the bad years when Apple's dev tools were half-baked and made you almost pine for the first dot-com bubble when you were writing Java (but in IntelliJ IDEA, which made it almost worth it).
But today Xcode is state-of-the-art, at least in terms of autocomplete, and it makes a big difference.
It replaces the (nonoverlapping) occurrences of " " with "" in str.
Regexes have facilities for case insensitive replace, as in most languages.
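For the case-insensitive question specifically, Foundation's own options parameter covers it without regexes (Swift 2-era naming, matching the examples above; Swift 3 renames this to `replacingOccurrences(of:with:options:range:)`):

```swift
import Foundation

let s = "Hello WORLD hello"
// Pass .CaseInsensitiveSearch instead of .LiteralSearch
let replaced = s.stringByReplacingOccurrencesOfString(
    "hello", withString: "hi",
    options: .CaseInsensitiveSearch, range: nil)
// replaced == "hi WORLD hi"
```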
Maybe I've been programming in Python too long, but str.replace is intuitive to me (unlike, say PHP's str_replace).
I'm guessing you don't program in Python much, judging by your question about string copying.
What bothers me the most about Swift is that it's essentially a C# clone, actually a worse one, because C# has more features. It is a travesty that Apple chose to design yet another shit language when there are so many other alternatives. And now, like with Go, everybody is jumping on the bandwagon because it's Apple.
That page reads like someone who doesn't know much about other languages cherry-picking and making dubious connections. A fair chunk of the Swift examples listed strike me as radically different from the C#.
While I agree that Swift and C# share a lot of commonalities and are fairly similar languages, the better integration with Objective-C alone makes the new language worth it.
I don't know anyone in PL who thinks Swift is a bad language. It's a nice, conservative, tasteful design that uplifts some proven functional features. What's not to like?
I don't think it is a bad language, but I think it sometimes introduces unnecessary complexity: custom getters and setters on properties, willSet/didSet blocks, and operator overloading (including custom precedence and associativity). Also, coming from a background where pointers are an obvious thing, I do not understand why classes and structs need to be separate data types. None of these things is bad per se, but I do think they make the learning curve steeper and invite some level of abuse. Reminds me a lot of C++.
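As an illustration of the features in question, a small sketch (the `Thermostat` type is made up for this example):

```swift
class Thermostat {
    // Property observers run around every assignment -- convenient,
    // but an easy place to hide side effects.
    var celsius: Double = 20 {
        willSet { print("about to change from \(celsius) to \(newValue)") }
        didSet { print("delta was \(celsius - oldValue)") }
    }

    // Computed property with a custom getter and setter.
    var fahrenheit: Double {
        get { return celsius * 9 / 5 + 32 }
        set { celsius = (newValue - 32) * 5 / 9 }
    }
}

let t = Thermostat()
t.fahrenheit = 212  // the setter assigns celsius, firing willSet/didSet
// t.celsius == 100.0
```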
I worked in C# for eight years before switching to iOS. To me, the similarity between Swift and C# ends with the use of curly braces.
C# is, in essence, a better Java. A much, much better Java. Swift is a different language: it implements some new ideas (much more advanced type inference) and forces the programmer to think in a different way (by trying to eliminate nullability).
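The nullability point in concrete terms: optional types force the absent case to be handled before the value can be used (a minimal sketch):

```swift
var nickname: String? = nil        // absence is part of the type
// nickname.uppercaseString        // compile error: must unwrap first

// Unwrap explicitly...
if let name = nickname {
    print("Hello, \(name)")
} else {
    print("No nickname set")
}

// ...or provide a fallback with nil-coalescing
let display = nickname ?? "anonymous"
// display == "anonymous"
```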
Swift is not a perfect language (my biggest gripe is very awkward generics implementation), but it is a good language, and it's getting better. Saying that it's "a shit language currently on the spot just because Apple" isn't fair, I think.
I love C# and think it's a great language, but when it was as old as Swift is now it was widely perceived as a Java clone. In fact, even today C# and Java are much closer in language design philosophy and implementation than C# and Swift, and calling Swift a "C# clone" betrays a profound lack of familiarity with what Swift is or how it works.
That sounds really trivial to me. You've described less difference between Swift and C# than exists between C# and Java, and those latter languages are commonly thought to be very similar.
And C# is a Java clone which is a C++ clone which is a C clone. Who cares if languages take other languages as inspiration? Would you prefer a Disney-like situation where innovation is stifled because that is no longer possible?
Int is a 64-bit data type (on 64-bit platforms) and Int32 is 32 bits. The compiler refuses to infer that you want to lose data. Converting the other way, the compiler refuses to infer that you want to bloat your data.
CGFloat, depending on the runtime, may either be a Float or a Double, so explicitly converting to and from it is required.
Rather than create a byzantine set of rules about whether it is "safe" to convert from one type to another, Swift wisely requires that all type conversions be explicit.
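What that looks like in practice (CGFloat requires an Apple platform; the rest is pure Swift):

```swift
import CoreGraphics  // for CGFloat (Apple platforms)

let small: Int32 = 1_000
let wide: Int = Int(small)    // even widening must be spelled out
// let bad: Int32 = wide      // compile error: no implicit narrowing
let narrow = Int32(wide)      // explicit; traps at runtime if out of range

let ratio: CGFloat = 0.5
let d: Double = Double(ratio) // CGFloat <-> Double is also explicit
```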
Kotlin looks somewhat like Swift (or Swift like Kotlin, depending on your perspective) and you get all the tools and libraries that have been built up over the years for server side programming on the JVM.
"Not sane"? You can literally unzip a JVM and then run "java -jar myapp.jar", how is that too complicated for modern day sysadmins?
But if you want a single copyable deployment then look at the javapackager tool in Java 8. It will take a unified JAR (which any build tool can produce) and create a .deb/.rpm/.tar.gz with a bundled JRE inside such that you can install it and then run your app in the usual UNIX way (by name). You wouldn't even know it's Java under the hood.
Kotlin doesn't need any web frameworks because it's 100% interoperable with Java, so you'd just use any of those like Play Framework, Ninja framework, Spring Boot, etc.
I'll have a look at the javapackager tool, thanks. For now the only JVM web framework I've used and found elegant enough is Play, but since it's written in Scala, I don't feel like using another non-Java language on top (I already felt that using Java with Play made it a second-class citizen). Not that it won't work, but it has every chance of creating weird code patterns and edge cases, and feeling unnatural.
Should one wait for Swift 3? The API/syntax changes are a concern, and code-breaking changes aren't that great. Also, last time I checked, Swift still requires a 64-bit iOS device.
Apple provides code migration support in Xcode, so when they break things it's mostly a mechanical and sometimes automatic update. I migrated our 50K lines of Swift code from Swift 1.2 to 2.x in a few hours.
Something to think about from end user Joe, here: A quick search seemed to indicate that Swift support (officially) began in iOS 7. Great, that's the version I'm on. If anything coming down the pipe for Swift depends on a later version, I'm not on board. I'm not going anywhere. A lot of users, myself included, have been burned when a version upgrade runs like a dog on old hardware...but unfortunately, I'm not going to spring for a new iPad every year, much to the chagrin of Apple. It'd be interesting to see how App Store success correlates with how many versions you can support.
Why is he talking as though using objects is out, and using value types is the only valid path with Swift?
Especially during a transition phase, which we are inescapably in right now, wouldn't it be pragmatic to continue using objects when needed, even if the eventual goal is to end up transitioning to protocol oriented programming?
In other words we don't need to conflate Swift with POP, which I think is what he's doing when he talks about the purported problems of NSNotificationCenter.
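To the pragmatic point: protocols adopt equally well on classes and structs, so leaning on POP doesn't require abandoning objects mid-transition (the names below are illustrative):

```swift
// A protocol with a default implementation works for both
// reference types (existing ObjC-style classes) and value types.
protocol Describable {
    var name: String { get }
}

extension Describable {
    func describe() -> String { return "I am \(name)" }
}

class LegacyController: Describable { let name = "controller" }
struct NewModel: Describable { let name = "model" }

// LegacyController().describe() == "I am controller"
// NewModel().describe() == "I am model"
```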