So, embedded (IoT) and a bit of machine learning is what Japanese developers do with Ruby? The article is a bit thin on the "what they actually do with Ruby aside from Rails" side.
There was already a long history of app development when smartphones came around, especially in Java, and iOS (or really any Apple product) had excellent Japanese support from the start.
IMO on the mobile side, there was already a strong native culture that made it a harder sell to go the cross platform route.
It's a good question because I understand Japan to be the region in the world where iOS dominates smartphones the most (over 50%?). I know RubyMotion has Android support and it continues to get better but I think it's fair to say RubyMotion is strongest for iOS/macOS work.
The company very recently came under new ownership/management ... perhaps Amir or someone will look at regional situations and gain more mindshare for the platform.
I think "not Rails" should be expanded a bit, because it risks someone saying "ok, so... sinatra?"
Web applications have some very distinct profiles. Most requests:
1. Parse HTTP
2. Parse the request body (likely JSON), build Ruby objects from the data.
3. Load data from a datastore, build Ruby objects from the data.
4. Do some manipulation of data.
5. Serialize Ruby objects into a response.
Basically, because of the stateless nature of a web request, most of the work done (ignoring IO) is serializing and deserializing data.
By not having Ruby be responsible for the state of our application, we make Ruby responsible for fetching and sending the state.
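A minimal sketch of that request profile, Sinatra-style (the route, the ORDERS stand-in for a datastore, and the field names are all made up for illustration):

    require 'sinatra'
    require 'json'

    # stand-in for the datastore
    ORDERS = { "1" => { "id" => "1", "total" => 100 } }

    post '/orders/:id/discount' do
      payload = JSON.parse(request.body.read)   # steps 1-2: parse HTTP and the JSON body into Ruby objects
      order   = ORDERS.fetch(params[:id]).dup   # step 3: load state, build Ruby objects
      order["total"] -= payload["discount"]     # step 4: manipulate the data
      order.to_json                             # step 5: serialize a response, then forget everything
    end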
Now think about a stateful system (like we would have with embedded Ruby). The state is kept within Ruby for the most part. We don't have to hit a data store to restore state every time we want to do something. This significantly changes the operating profile of the application, and therefore the optimization strategy changes.
Even in distributed systems, coordination of data from stateless systems can be a nightmare. Requiring a bit of stickiness simplifies a lot of things.
For example, I've worked on a few large scale adserving auction platforms. One of them had the auction and impression events recorded statelessly (i.e. each event was generated on a random machine). This required a big Hadoop job to match up all the events from a single auction. Another system used sticky events, where all events for an auction would hit the original server that ran the auction. It would store all events for an auction in memory and then flush it to the billing system when the data was finalized. Was it perfect? No: if the one machine crashed, all pending ad info was lost. But that didn't happen often so the cost was negligible.
I don't know about its popularity, but from what I've seen it doesn't seem to be on the hype train anymore. No technology stays on the hype train forever. Of course this is just based on observing trends, it shouldn't dampen your enthusiasm if you've found something that works for you.
Rails still solves all the problems it solved last year or two years ago. I'm always astonished when teams give in to the hype train and replace the two things Rails excels at, templating and database access, with a 10MB frontend JS blob and microservices running in a JVM language.
Just use Rails, it scales just fine. Your hand-coded DB layer for your microservice will never be as good as ActiveRecord if you aren't Facebook.
Why do you say that Rails excels at templating? Ignoring for the moment the obvious differences between server-rendered HTML and JS frontends, the Rails templating doesn't strike me as one of the most prominent solutions Rails provides. The templating is fine, but other than a few niceties (like layout hierarchies and resource-based forms) it's not much more than a thin syntax to embed Ruby eval inside HTML.
If I had to pick the top things that Rails excels at, I'd point to ActiveModel and resourceful routing. I know REST isn't the hotness any more, but it sure is great for CRUD web sites or APIs, and Rails makes setting up database models, associations, and REST routes a breeze.
I also don't think Rails templating is anything special but it's well designed.
One strong point is that it's full Ruby, so there is no need to learn another language. This means that one could write the application in the views (just like vanilla PHP) but in my experience nobody does it.
Another nice feature is that every instance variable of the controller is available in the view. No need to waste time declaring them, and one less chance to introduce a bug (I'm looking at you, Django.) I still remember how good it felt coming from Java Struts in 2005.
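For instance, a minimal sketch of that mechanism (controller, model and view names are made up):

    class ArticlesController < ApplicationController
      def show
        # anything assigned to an instance variable here...
        @article = Article.find(params[:id])
      end
    end

    # ...is available directly in the view, app/views/articles/show.html.erb:
    #   <h1><%= @article.title %></h1>
    #   <p><%= @article.body %></p>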
I'd go so far as to not only agree with your picks for the best parts of Rails, but also to extend your skepticism about the excellence of Rails' templating and call it the worst part of Rails. Of course, one can use Rails as an API-only service and it's quite adept at that.
I don't know if I would call Rails templating excellent, but it is acceptable. The nature of HTML allows for quite a lot of flexibility. However, I have to disagree with your view of Rails in API usage. Maintaining a client contract for fields and value types is far more difficult in Rails, at least out of the box, than similar offerings.
If you send the wrong HTML tag, the content just might look a little funny to the end user. If you send the wrong datatype to an API client, it generally won't be able to do anything with it, and may not fail gracefully at that. Constructs to ensure you cannot make that mistake are invaluable in API design.
I wasn't looking for advice about web frameworks, but since you offered it, do you believe developers using other web frameworks are misguided in doing so? It's not like Rails is the only game in town, comparable solutions exist.
I'm talking about teams that use Rails as router, microservices as ORM, and a heavyweight React behemoth on the front, just for simple CRUD. Those projects would almost always be better off using straight Rails.
Node is going after that space, so you can see an increase in server-side JavaScript stuff coupled with a decrease in Ruby, and probably some PHP and Java as well. Ruby is the most vulnerable of those.
With Python people are doing data analysis and machine learning in addition to web development, with JS you can do frontend web development and if you use JS for the backend you can even do server side rendering for React/Angular. With Ruby you just have Ruby on Rails.
The Ruby story in Europe is very similar to the American one: Rails, Rails and Rails. However the last thing I wrote in Ruby is a scraper for a web site, a few hours after patching a Rails app I've been maintaining for a customer for at least five years.
About more non Rails stuff, I started writing small text processing scripts in Ruby a few years ago. I wrote them in Perl before, but I eventually got more at ease with Ruby, notwithstanding the compact convenience of Perl's while (<>) loop and the implicit input variable. I also made a mruby program run on AWS Lambda just because I could. Interesting but probably not extremely useful in that context (pun intended).
I wonder if there will be a second Ruby wave from Japan, if their community keeps growing and their use cases start reaching abroad. Remember that the first wave (Rails) was actually from the USA.
Hopefully! The problem with Ruby is that it's just fine, as in good enough, but not better or more compelling than X or Y.
I hope Crystal changes this, or that there will be some breakthrough in MRI.
I'm a non-typical consumer who has simply used Ruby as a Python drop-in for text manipulation and interacting with APIs.
Overall I found the philosophy to Ruby to be a bit more elegant than Python's, but I struggled with some of Ruby's philosophies about what an object is and how to store them.
In general I enjoy Ruby as a general purpose language, but code I wrote to interact with REST APIs was eventually replaced with Python at my work because "not enough people know Ruby".
What is Ruby's killer feature to help drive adoption in situations such as these? By "these" I mean as a general-purpose scripting language.
There is no clear-cut killer feature for Ruby over Python in that context, and vice versa. Both are fairly good compared to the alternatives. The only true killer feature of Ruby is Rails, much better than the Django and Web2py I had to work with in the last year, but that's out of the scope of your question. If your team knows Python you're likely to stay there.
I find Ruby to be much easier to read and understand than Python but I have 10 years of customers paying me to write Ruby vs one in Python. I assure you that it would be the other way around if I had worked in Python for 10 years.
I never picked Python for my own projects in all those years because I find many of its design choices weird and painful (I joke that it's Ruby for masochists), but that is very subjective. People liking Python consider good even what I find to be objective weaknesses of Python (example: lambdas limited to a single line.) This means that probably both languages are good enough.
Good answer. Our discussion sort of touches on the ingrained biases organizations can have toward a particular language, whether for the good or bad. Ideally I would push towards learning new languages and approaches to problems, but organizations (mine included) push towards the principle of least astonishment and it can be difficult to break that barrier. There's good business sense in doing so. Or, to rephrase, if your organization is ingrained in doing things a particular way the tradeoff of attempting to adopt something newer and/or different had better be justified.
This is nothing particularly insightful on its own, of course, but this philosophy in terms of "what language do I write my code in" is worth a retread every now and then.
I have some experience using RoR but I'm considering Django for my next project. Could you elaborate on what you liked more about RoR compared to the latter?
The answer could be extremely long, but the single most important point is that the Python frameworks I used leave developers on their own to make architectural choices. Combined with some poor framework defaults [1], I always saw very poor project structures (my sample space is 3). I inherited projects with a single, thousands-of-lines view (Django) or controller (Web2Py). Rails would basically force developers to create multiple controllers. Apparently developers using Django and Web2Py don't understand that they should or can do the same.
If you're good at designing backends and the project is yours, no problem. You'll have to work more and reinvent some wheels but you'll get exactly what you want, no compromises.
If the project is for a customer and then you move on to another customer, do your best to leave a project that the next developer can understand easily. As every Django project can look very different, that's not granted.
On the other side, every Rails project looks the same and every project I moved into was well organized. I'm a freelancer and my customers call me to add features or fix things. They don't have time and budget for heavy refactoring, so those multi-thousand-line files are still there, minus some code I managed to extract.
[1] The poor defaults are that a single views.py and controller/default.py are almost all you need to run the app. Hence the 2670-line default.py I'm wrestling with now. Luckily that was a small intranet backend.
You may be aware of this, but for those who aren't: Ruby has much of this. Perl is perhaps Ruby's other "parent" besides Smalltalk, for good or bad, and there is a lot of stuff stolen from Perl that sees relatively little use in Ruby today.
The "-n" switch adds an implicit "while gets; ....; end" loop, leaving the input line in $_. "-a" will do "$F = $_.split", which means it splits on the contents of "$;" which can be set with "-F" like with awk.
A decade ago I got introduced to Ruby by writing a static analyzer which used abstract interpretation [1] more or less following the hardcore lattice book [2] as taught by the Nielsons themselves.
I used Ruby because my groupmates found Lisp, ML or Haskell too difficult, and Ruby has (had) a very nice Set datatype plus pretty good metaprogramming and DSL abilities.
The code was on Github for some time and it was always funny to see recruiters reaching me, looking for people with > X years Rails experience, as they always equated Ruby to Rails.
It is worth noting that the original reason for Ruby is that 20 years ago there was no scripting language which could handle Japanese text.
Hence a Lisp hacker decided to write a scripting language which could do so, using the Smalltalk object model with a library carefully designed to work a lot like Perl. The result was Ruby.
Blatantly false. Perl 4 could handle Japanese text perfectly okay in the early 90s. Neither Perl nor Ruby supported character encodings initially and you had to rely on NKF/KConv etc.
> no scripting language which could handle Japanese text.
Matz doesn't say this anywhere I can find. He usually says he wanted a more object-oriented scripting language and wasn't satisfied with Perl or Python.
First, you're looking at a feature of modern Ruby that got added in 1.9. Before that, Ruby only supported a single encoding. If you're interested in the details, look up how the $KCODE global variable used to affect string handling in Ruby <= 1.8.
Second, what do you expect if you have an invalid byte sequence in your UTF-8 string? str.valid_encoding? would have told you that the string contains invalid characters. String#encode gives you fine grained control over how to handle invalid characters in your strings.
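A small sketch of those calls, assuming the UTF-8 default of modern Ruby:

    dirty = "caf\xE9"                   # tagged UTF-8, but the lone \xE9 byte is not valid UTF-8
    dirty.valid_encoding?               # => false
    dirty.scrub("?")                    # => "caf?"   (Ruby 2.1+)
    latin = dirty.dup.force_encoding("ISO-8859-1")
    latin.encode("UTF-8",
                 invalid: :replace,
                 undef:   :replace)     # => "café" (the byte is valid Latin-1, so nothing gets replaced here)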
... and leaves the responsibility of whether they're bytes or strings (and in which encoding) solely on the programmer. "force_encoding" is a thing that just should not exist on a string.
Python 3 and Go do it the correct way: arrays of bytes are arrays of bytes, and strings are {array of bytes + encoding}. Ruby could have done it too by introducing a new type or something, but chose to do otherwise in the name of a theoretical backwards compatibility that doesn't exist in practice, since every single release of Ruby introduced backwards-incompatible changes anyway.
> It's documented that the default encoding in ruby is UTF-8.
The complaint is not that the default string has no encoding, it's that proper encoding is not validated when creating the string, but only when processing it (and even then it might depend on the actual operation being applied).
So given (1) a string object with (2) a proper encoding, it may still blow up in your face at any point unless you defensively check every string you get with `valid_encoding?`
AFAIK you are incorrect about Python 3. Python 3 strings are internally represented as ASCII, Latin-1 (since 3.3), UCS-2, or UCS-4. So for an arbitrary encoding this statement would be incorrect.
Ruby's model looks much more like array of bytes + encoding to me; how else could I, for example, change a Latin-1 string into an invalid UTF-8 string with force_encoding? Ruby holds off the validation of the string until it's really needed. Lazy validation doesn't strike me as odd for a dynamic language.
I agree that String#force_encoding would be nice if it were not needed but in the real world you get all kinds of broken encoded data. String#encode most of the time is enough but as a last resort, String#force_encoding is there to use.
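A sketch of that lazy validation in action (assuming the UTF-8 default):

    s = "caf\xE9".force_encoding("ISO-8859-1")   # relabel the bytes; nothing is validated yet
    s.valid_encoding?                            # => true  ("\xE9" is 'é' in Latin-1)
    t = s.dup.force_encoding("UTF-8")            # same bytes, different label
    t.valid_encoding?                            # => false
    t =~ /é/                                     # only now does Ruby complain about the invalid byte sequence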
I'm a bit surprised that you mention Go because if I look at https://blog.golang.org/strings "Similarly, unless [the literal string] contains UTF-8-breaking escapes like those from the previous section, a regular string literal will also always contain valid UTF-8."
This is exactly the same behavior as in Ruby. I put the invalid byte sequence there deliberately and Ruby and apparently also Go don't have a problem with that.
Regarding Go, what matters is that you read/write []byte and transform it to string explicitly.
> in the real world you get all kinds of broken encoded data
> String#encode most of the time is enough but as a last resort, String#force_encoding is there to use.
When your byte arrays and strings get passed around your code, you soon have no way to tell which one is safe and which is not, which one is a byte array and which is a string.
The thing is not about content, it's about communication and contracts. #force_encoding should not be available on a string, and #encode should not be available on a byte array. If you receive something that may be broken, it should be received into a byte array (b"" in python, []byte in Go), then "cross the gate" to be a string, and that moment is where you eventually perform sanitisation (u"" in python, string in Go). From then on a consumer of your string will be able to assume its content and its declared encoding match.
> I'm a bit surprised that you mention Go [...] This is exactly the same behavior as in Ruby.
Indeed, but you have two types to use and discriminate against, with assorted funcs corresponding to each level of abstraction.
From the Go doc [0]:
> It's important to state right up front that a string holds arbitrary bytes. It is not required to hold Unicode text, UTF-8 text, or any other predefined format. As far as the content of a string is concerned, it is exactly equivalent to a slice of bytes.
I am definitely not concerned about what is in a string. I am concerned about distinguishing between "I just read that bunch of data from somewhere and I'm not yet sure what it is" vs "Ok, that data has been through some earlier process that took guarantees as to what format I expect it to be in". Different abstraction levels. Where Go gets it right is that io.Reader/Writer reads/writes []byte, not string. So you're explicitly acting on your own responsibility when you do:
var b = []byte{0xA4}
var s = string(b)
foo.IExpectSomeSpecificEncoding(s)
So, when to insert a sanity check becomes obvious, and relying on the static type system/method dispatch to check things for you becomes incredibly useful. And foo.IExpectSomeSpecificEncoding can then use the string as an opaque container.
A Python 3 string is not an array of "bytes + encoding". A Python 3 string is a sequence of Unicode code points, and you neither know nor are supposed to care what the underlying bytes in memory are. All operations on strings in Python 3 are in terms of code points: iterating yields the characters corresponding to the code points, indexing yields the character corresponding to the code point at the index, and length reports the number of code points.
If you want bytes + encoding in Python 3, you want the 'bytes' type, not the 'str' type.
This was a shortcut way of summing things up. Indeed I don't care what it is behind the scenes, it's a string object and that's what matters vs being raw bytes that I can liberally interpret or fix. "A Python 3 string is a sequence of Unicode code points" is just what I mean, those are just bytes in memory that are interpreted as Unicode code points by Python, and it's opaquely exposed to the developer as a string.
But that's quite a different thing. In Python, you only get Unicode strings.
In Ruby you get strings that are arrays of bytes that are lazily interpreted as a string with the set encoding.
In that regard, Ruby strings are more powerful than Python strings as you can handle different encodings. In Python, you have to work around that with the bytes objects to handle non Unicode encodings.
> Ruby strings are more powerful than Python strings as you can handle different encodings. In Python, you have to work around that with the bytes objects to handle non Unicode encodings.
True, but that's another debate entirely. The counterpoint is that Python's stance makes it extremely powerful as a consumer of strings has a lot of guarantees about the string he's receiving, which is great for defensive programming.
As a Python person, I'd say it's not so much about "power" as it is about correctness: equating "sequence of bytes in an encoding" with "string" is the source of a lot of (potentially hard-to-find) bugs and annoyances. So the approach of saying that a string is a sequence of Unicode codepoints, and that there's a separate type to represent sequences of bytes (plus codecs to handle translating between the two types), is a better way to do things.
And I think this is borne out by the sheer number of articles that have to be written to remind people who use "strings are bytes in an encoding" languages of just how many assumptions they end up making that turn out to be wrong (most of which boil down to expecting bytes-in-an-encoding types to behave the way Python's string really do behave, as sequences of codepoints rather than as sequences of bytes where there may or may not be one-to-one-correspondence between bytes and codepoints).
I picked up Ruby in Japan before the era of Rails, back in 2004. There was a lot of enthusiasm around the language in Japan even at that time; it seemed like people were using it for everything. Personally, I knew a bunch of people who were using it for all their command-line tooling. These days, I bet most Western Ruby devs couldn't even tell you what gets does, but back then it was like magic.
From another comment, it seems like it was invented to specifically support Japanese characters. It might have been the only scripting language for a while in Japan so it makes sense that it became popular over there.
I currently live in Matsue. When I tell people I am a programmer, even decidedly non-computer people will ask "Ruby?" They have Ruby-branded instant ramen in souvenir shops (with the proceeds going to the/some Ruby association). And they have that billboard as seen in the article.
Basically this reads as "Ruby was Python with better Japanese documentation and much better string encoding handling for Japanese use-cases". IIRC they came out at about the same time with about the same goals (correct me please), but I wouldn't wish to make programs that have to handle Japanese text in Python 1.
The default string type has been a Unicode string since Python 3; a special string type for Unicode had been part of the language since somewhere in the early 2.x releases (which happened 5+ years before 3.0, in the early 2000s). Not being the default, it of course wasn't generally supported in libraries, and it still came a lot later than Ruby.
Unicode strings were added to Python in 2.0 (released in 2000). What changed in Python 3 is that they became the only string type, with what used to be non-Unicode strings effectively becoming immutable byte arrays.
However, Japanese often don't want Unicode, because of Han unification, and the associated controversy. Thus, other encodings are often preferred, and languages that can handle them in a transparent way have the advantage. It's largely for this reason that Ruby avoided a Unicode string type for a long time, treating strings mostly as raw byte arrays, with a few places that needed encoding supporting them on an ad-hoc basis.
In Ruby 1.9 (2007), they finally added encoding-aware strings; but unlike Python 3, where the encoding is always Unicode, in Ruby the encoding can be anything, and string is bytes + encoding. So you can do UTF-8, but you can also do e.g. Shift_JIS (which many Japanese users prefer). The cost is that it means that e.g. a+b is no longer always valid for two arbitrary strings, if they are in different encodings that cannot be reconciled.
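A quick sketch of that cost:

    utf8 = "日本語"                       # UTF-8, the default
    sjis = "日本語".encode("Shift_JIS")
    utf8 + utf8                           # fine
    utf8 + sjis                           # => Encoding::CompatibilityError
    utf8 + sjis.encode("UTF-8")           # you have to reconcile the encodings yourself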
I was quite amused by the design of Oniguruma [1], which is essentially Ruby's regular expression library. It contains tons of associated tables for each supported character encoding---segmentation, character types, case folding and the like. I found it ironic that the non-ASCII case folding doesn't seem to work for most CJK character encodings though (in my shallow analysis).
So thankfully, it would appear that Ruby is kept alive by the Japanese.
I have never really understood the craze for Python. I am by no means a guru, but I consider Ruby a far superior language (yes, I recognize that the numerous scientific and ML libraries of Python make it a logical choice for many projects, but this is sort of a self-fulfilling prophecy).
My love for Ruby declared, I have on occasion wished that we had a "strict" feature in the language, similar to ES5's strict mode, that would permit only the use of Ruby's more straightforward features and also exclude deprecated methods and techniques. This might give the language a deserved boost.
Does anyone know what the most utilized language in Japan is?
And what framework do they use there for web projects, if not Rails?
Have you looked at the C source of Ruby? :) In terms of that, Python is far, far superior to Ruby.
EDIT: to clarify, CPython is much cleaner than MRI, and has a much-better-documented API for extensions. This is what I believe part of the reason that Python has a much better coverage for scientific packages.
Disagree. I deal with the C parts of Perl, Ruby and Python, and better VMs, on a daily basis. Ruby has by far the cleanest implementation of the VM. Almost like the VM of a good Lisp/functional language. Python and Perl are the worst of all, on par with PHP.
Ruby is unfortunately also the slowest. But not because of the VM implementation, only because of the everything-is-a-method architecture, even for the simplest primitives. The Ruby API is pure and clean; this is what makes it slow.
For the Python or Perl API too much inner state needs to be handled, which also makes them slow, because you cannot use a better design for data and code. Only PHP made significant advances there recently, so you don't have to worry about inefficient data and code, only about the eye-bleeding VM.
I don't know if I can agree that the Ruby C API implementation is the cleanest -- from my experience I'll say it's very pure, but pure does not really mean clean, and it sometimes (esp. when performance is a concern) makes writing C extensions messy.
Python's C API could be argued that it uses too much abstraction, but it definitely makes it that much easier to write extensions.
Python is slightly less powerful but is pretty fine for many tasks, and much easier to learn than Ruby, with a consistent syntax. So you can quickly learn it and then spend the rest of the day focusing on getting things done.
I would disagree.
For instance, there are at least two syntaxes in use to describe hashes; parentheses are optionally (and inconsistently) used and so on.
Rails suffers from the same issues as the language does.
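For concreteness, a sketch of the two hash syntaxes and the optional parentheses in question (the greet method is made up):

    old_style = { :name => "Matz" }        # hash-rocket syntax
    new_style = { name: "Matz" }           # symbol-key shorthand, Ruby 1.9+
    old_style == new_style                 # => true

    def greet(name)
      puts "hi #{name}"
    end

    greet "world"                          # parentheses are optional...
    greet("world")                         # ...and usage varies from codebase to codebase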
How is Python less powerful? They both have metaclasses, monkey patching, first-class functions, Unicode... Ruby has blocks, Python has decorators. You can solve pretty much the same problems with the same number of lines.
I should preface this by saying that I prefer Python for my projects for a few reasons, however I agree with parent that Python is less powerful than Ruby.
Both languages have metaclasses but Ruby's are simpler to grasp and more fluid in how they're applied because they're more run time oriented rather than instantiation time oriented like Python's. In fact Python in general tries to make more of a distinction between the instantiation and runtime environments making it a bit more restricted. All this means the average Ruby coder tends to use metaclasses almost everywhere and certainly a lot more than the average Python coder.
Ironically even string processing with encodings is harder in Python 3 than Ruby 1.9 (these are the versions in which both languages modernised their Unicode support). Python's insistence on Unicode everywhere is problematic in places like Japan where there are still extant competing encoding standards that are not fully recoverable when doing a round trip via Unicode. Even ignoring Japan though, Python has had to put in all sorts of hacks to deal with corner cases like the fact that command line parameters or file system names don't always have to use Unicode encodings. 95% of the time you can be ignorant of all this in Python and stuff will just work... but it's a massive pain when you're dealing with the other 5%. I genuinely still don't know if the Python 3 or Ruby 1.9 approach is better... Python's is purer and initially I liked it better conceptually... but having had to apply it practically I'm no longer so convinced.
Finally, I assume you weren't comparing blocks and decorators as somehow equivalent features. It might be worth adding that it's pretty easy to add syntactically pleasing decorators to Ruby without needing special built in syntax, by leveraging Ruby's metaprogramming capabilities. It just tends not to be used as often in Ruby because there's usually different ways to express the equivalent concept.
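As a sketch of what that can look like, here is one way to wrap methods decorator-style with plain metaprogramming (the Decorators module, logged helper and Calculator class are made up for illustration):

    module Decorators
      def logged(name)
        original = instance_method(name)
        define_method(name) do |*args|
          puts "calling #{name} with #{args.inspect}"
          original.bind(self).call(*args)
        end
      end
    end

    class Calculator
      extend Decorators

      def add(a, b)
        a + b
      end
      logged :add          # "decorate" the method after defining it
    end

    Calculator.new.add(2, 3)   # prints the log line, returns 5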
But while on the topic, blocks are probably the killer feature that makes Ruby more powerful. They essentially allow you to easily change the context of an individual method call in very arbitrary ways. Yes you can do it by passing in named functions or even function pointers in C, but none of this is as syntactically coherent as blocks in Ruby.
If you want a practical example of how this makes Ruby powerful take a look at Rake. It's basically a Make equivalent written in Ruby. The nice thing is that you never leave Ruby, all you do is add a "require 'rake'" at the top of your Ruby file and you now have enough DSL machinery to be able to write pretty nice rule syntax without ever having to write a parser for a custom language (and leaving the power of Ruby behind).
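A minimal sketch of the kind of DSL being described (task names made up):

    require 'rake'

    task :compile do
      puts "compiling..."
    end

    task :test => :compile do      # plain Ruby, but it reads like a Makefile rule
      puts "running tests..."
    end

    Rake::Task[:test].invoke       # runs :compile, then :test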
I don't expect any of this to be particularly convincing. The difference in power only becomes obvious when you start writing idiomatically in both languages. When I first started writing Python I tended to write it more like idiomatic Ruby and it was not pretty. Overall Python and Ruby are at a fairly similar level to each other but Ruby definitely has a bit of an edge in terms of expressiveness.
Python is arguably more "obviously readable", in a sense that when you see some construct, and you don't know what it is, but you have a guess, that guess is usually correct. It makes the grammar more verbose than Ruby in many ways, and idiomatic Python is almost always more verbose than idiomatic Ruby, but for many people, ease of reading and understanding is preferable.
It's a double-edged sword. The code's behavior is apparent in that simple example, but things get considerably more difficult to understand when you're dealing with monkey patched methods and dangerously clever method_missing magic in non standard library code.
Yeah, I had to deal with a Perl port of a Ruby library that had kept most of the rubyisms in. The really disturbing thing about it was the Frankenstein levels of autovivification involved. Fortunately someone (not me) is going to get paid to deal with that soon.
You can certainly find snippets that are simpler in Ruby than in Python; they're just not the majority. Consider a slightly more complicated case where you need to print out items in a range with some modification. Idiomatic Ruby:
    (0...5).each do |x|
      print(x + 1)
    end
Idiomatic Python:
    for x in range(0, 5):
        print(x + 1)
To understand Ruby, you need to know what the |x| syntax does, and the overall sequencing of the words as you read that construct is somewhat unnatural for English. Python is comparatively easier to parse with no prior background.
It is definitely possible to write Ruby in a way that's still highly readable. But from what I've seen of real world Ruby code, that's not the way it tends to be written - idiomatic Ruby values expressiveness and DRY, so you see blocks and various terse syntactic sugar used all over the place; lots of "clever" code in general. There's nothing wrong with that - I generally prefer expressiveness over readability myself - but many people do not.
I see your point. But I'll take the liberty to use part of your comment as my own response: "You can certainly find snippets that are simpler in Ruby than in Python, and vice-versa". Now that is something I 100% agree with (I added the part after the comma btw).
Basically what I'm saying is that the differences are very minor compared to differences vs, for example, other languages (Java, Scala, C++, etc). So from that perspective there's not much to complain about. I just have a slight bias towards Ruby, I find it a little more readable (in a small way).
While it is true that with Ruby you have to know more syntax, without previous knowledge of either language you'd catch on pretty fast.
To me it is more expressive of my intent to say "0 to 5, for each, do something" than "for something in the 0 to 5 range do something"
Also you happened upon what peeves me most about Python, you are not actually looping from 0 to 5 since range is inclusive at the start, but exclusive at the end. So while (0..5) gives you 0, 1, 2, 3, 4, and 5, range(0,5) will get you just 0, 1, 2, 3, and 4. Of course it's easier if you think that range() is rather "I want to take X steps", so if you just want to step 5 times range(5) does exactly what you need it to do.
It depends on what you're trying to do. It's pretty common with strings to ask for "everything excluding the first and the last character", for example, and then you want the end index (counted backwards), not length. Another common case is when you looked up a substring and got an index, and now you're slicing the string around that index.
But there are also many cases where length is more desirable.
Ditto with ranges - both end-inclusive and end-exclusive ranges are useful in different cases. I kinda like the fact that Ruby lets you choose, but I think that .. vs ... syntax distinction is too subtle and likely to cause bugs.
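For reference, the distinction in question:

    (0..5).to_a     # => [0, 1, 2, 3, 4, 5]   two dots: end-inclusive
    (0...5).to_a    # => [0, 1, 2, 3, 4]      three dots: end-exclusive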
I kinda like the way Nim does it: .. for inclusive, and ..< for exclusive - e.g. 1..<10. They also use special syntax for count-from-end, instead of negative indices, which is also a good thing IMO (the decision to index from start or from end is usually something that's not going to change at runtime; but with negative indexing, it might inadvertently do so if you underflow when computing the index).
Care to elaborate with an example? I use Python day-to-day and did quite a bit of Ruby a few years ago, but I fail to see significant differences in verbosity.
What I do see is that Python is somewhat less consistent and readable (list comprehensions beyond the simplest ones are unreadable, Ruby's blocks allow cleaner DSLs, len(a) vs a.size, little things like "3.times", allowing "?" and "!" in identifiers). But Python has a bigger and more diverse ecosystem. So although I would prefer Ruby over Python, the ecosystem aspect makes the point moot.
I can't, and could never, understand why this is, and whether it is a problem.
GitHub, Basecamp, Twitch, Stripe, Square, Airbnb, Shopify, Cookpad, and many more. All of these are very successful sites that are using Ruby (not necessarily Rails) and most of them are profitable.
If each of them could fund 20-33% of the salary of an additional core dev, you'd instantly have 3-4 people working on it full time.
So I have been thinking about this for some time. Is it because the Ruby community, and the world as a whole, does not want to donate? Or are we lacking someone to push this and ask for help when needed?
Especially interesting when you compare to Python, which is used extensively by Google, Dropbox, Instagram, Pinterest, Reddit, Youtube.
Of those, I know both Dropbox and Google have employed Guido van Rossum, and various other Python core devs are also employed full time.
Overall it seems that Python is much more community driven than Ruby. Python is interesting in that it has several big camps, including web dev, systems programming/automation, scientific usage, machine learning, etc, and all of these communities are fairly self sufficient but also contribute back to the main language.
Well, that's exactly what is happening: these companies employ people who work more or less full time on Ruby core issues. A lot of Ruby features are funded by big corps. The 3 people are just the 'core' core team, like Linus was for Linux for years.
It's one small thing but string interpolation annoys the crap out of me when using Python and while it might tick the understandable check box I wouldn't say it's easier to read. After using Ruby it's such a chore to read or write in my opinion (which admittedly doesn't mean much having so little experience with Python).
At first glance, none. But you have to understand that Ruby has had `#{}` since way before, and Python only added the `f` syntax recently. Also, in Ruby it's the only way of doing string interpolation, while in Python you have % and other methods.
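For reference, the one Ruby form being referred to:

    name = "world"
    puts "hello #{name}"            # interpolation is built into double-quoted strings
    puts "2 + 2 = #{2 + 2}"         # any expression can go inside #{}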
You actually want both of these. The reason why explicitly indexed placeholder syntax is still important, even if you have interpolation, is because the order in which the arguments are referenced may change when the string is localized.
OTOH, f-strings were added to Python not long ago, and they are the third or fourth (depending on whether you count `string.Template`) syntax for string interpolation in Python. Frankly, string formatting in Python is a bit of a mess right now.
I used to have a preference for Ruby, but Python has a single killer feature that for me is missing from Ruby: explicitly imported module symbols.
When looking at new, unfamiliar code in Ruby there's just no way of instantly knowing which symbols come from which files/gems. There are conventions; sometimes these are followed well, sometimes not. Either way it adds a lot to the mental load of parsing new code, or even old code you've written yourself. Ruby's module import system is just too much like C's #include.
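A tiny sketch of the problem (file and method names made up):

    # helpers.rb
    def shout(text)
      text.upcase
    end

    # app.rb
    require_relative "helpers"     # pulls in everything helpers.rb defines, globally
    puts shout("hi")               # nothing at the call site says where shout came from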
Yeah, I've come to think this is perhaps Ruby's biggest design miss. It can be really hard to trace dependencies in Ruby, and doing so is a prerequisite for any deep understanding of what's going on. (Granted, this is mostly a problem in Rails projects, but the fact is that Ruby makes this problem possible.)
Yes, Ruby is my favorite language but they really messed up the module system. Ruby basically has no module system at all and everything is in the global namespace.
I mean, that's not technically true, there's definitely a module system. In fact, in some ways Ruby's module system is more powerful than Python's since it's completely divorced from the concept of the file. You can split a single module across multiple files or create multiple modules in a single file. You can nest modules as well of course.
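For instance (module and file names made up), the same module can simply be reopened from another file:

    # pricing/net.rb
    module Pricing
      def self.net(gross, vat)
        gross / (1.0 + vat)
      end
    end

    # pricing/gross.rb -- reopens the same module and adds to it
    module Pricing
      def self.gross(net, vat)
        net * (1.0 + vat)
      end
    end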
The problem is with the module import system, which as you alluded, imports the module structure in a given file into the global namespace. And it imports everything! There's simply no way to say which bits you want and where they should go.
I actually don't really see a reason why the import system could not be changed while keeping the module system in general as is. I wonder if anyone has ever tried something like that...
Dynamic types, OO with some extension for functional style, inherited Algol syntax with some homegrown innovations, batteries-included design. Where they differ, it's largely in whether they bend towards consistent style(Python) or more late-binding power(Ruby).
Cannot agree. Ruby loves its MOP, Python loves its method overloads.
Both are highly magic, but a MOP is technically superior to overloads, because you just flip the vtables and don't have to look at every single method call to see whether some Python programmer decided to overload it. Perl has no builtin MOP, and even worse overloads than Python.
but the Ruby community takes modules to magic extremes. Just take the overloading of include vs. extend as one example. Some modules don't even have any reasonable error messages or try to detect when someone stomps on their names or data. It's a madhouse with Ruby modules that try to be "magic" or "easy", which is never "simple".
If you ever find yourself pulling back the curtain of many ruby modules, it's time to dig in and swear as you go.
I'm Canadian. Ruby was my first programming language. I don't know Rails. Now that I know Python, a lot of my first programming projects are the kinds of things that I would use Python for nowadays.
American audiences often ask, "why would you want an embedded Ruby?"
You might as well ask "why would you want an embedded Python?"... the answer to that is fairly obvious to anyone who knows Python and C, and who has done embedded programming. So why is it less obvious to those same programmers why someone would want an embedded ruby?
Last week there was a big thread talking about how Arduino was extraneous overhead and we should be using the AVR libraries directly. I think they would have died upon hearing about running a garbage collector on a microcontroller.
The lines are rapidly blurring, given that you could fit, and power, an ARM core sufficiently powerful to run Linux (actually, you can run Linux on far smaller systems than that) with a WiFi interface and web server, and run Perl on it, on an SD card form-factor device... 4 years ago [1]. Of course there are plenty of niches where you still need a much lower power device, but a lot of areas where you needed that a few years back are areas where you can put a much bigger stack now and still do ok.
Arduino and AVR are 8-bit microcontrollers. ARM chips are 32- or 64-bit CPUs. There are plenty of ARM SoCs out there which can run Linux off of an SD card: Raspberry Pi and BeagleBone Black come to mind. There are plenty of use cases where an AVR chip is underkill (e.g. anything that needs parallel processing) and an ARM chip is overkill (e.g. a digital thermometer).
> There are plenty of ARM SoCs out there which can run Linux off of an SD card: Raspberry Pi and BeagleBone Black come to mind.
In case you misunderstood: The above is not about running Linux off of a stored diskimage on an SD card. The ARM SoC in question fits in an SDcard. They're intended for e.g. putting in your camera so you can have wireless, wifi access to the images on the SDcard.
Other than that I agree - there's certainly still room for both. But the point is that you can physically fit a Linux-capable computer in an SD card form factor, drawing 100mA, which means the niches where smaller microcontrollers are the only alternative have narrowed accordingly.
This is mostly orthogonal, but I'm yet another dev who loves using Ruby but absolutely hates Rails. Ruby is a joy to use, whereas Rails is soul-sucking, anti-fun and overrated.
I haven't been paid to write a line of Ruby in over 4 years but I still use it for the occasional hobby project. Not only do I not use Rails for fun, I'd have to be close to starvation before I used it again for profit.
Now I feel the urge to dust off my copy of 'Eloquent Ruby' and write some fun code.
The difference between front-end and backend web applications is nothing more than where the views are rendered - which on the client side also means managing state and IO.
This means Rails and Angular are doing the same job - the main difference is where data keeping and rendering is done.
Both Angular and Rails talk to "a backend." In Angular's case it's (probably) an HTTP server handing over JSON representation of data. In Rails' case it might well be exactly the same thing, or it might well be a NoSQL database handing over JSON representations of data, or it might be an SQL database.
You can certainly make distinctions between those data stores, but they have enough in common it's perfectly reasonable to say they're doing many of the same things.
Now, Angular is considerably more horrible than Rails and tries to solve some additional front end problems badly, and its use is a likely indicator that the organization or individual that have decided to employ it aren't capable of making wise technical decisions, so I'd say it's also reasonable to make distinctions between the two on a number of levels. But there's considerable overlap in the kinds of problems they attempt to solve.
The difference is Rails can handle both the front-end and backend of a typical web application, whereas Angular is just the front-end. Equating Rails data store backend to Angular's service backend is a poor analogy. The concepts and principles behind web browser client code, server-side application code, and data stores are 3 completely different engineering paradigms.
> The difference is Rails can handle both the front-end and backend of a typical web application, whereas Angular is just the front-end.
Think about these questions: is Rails really "behind" everything involved in a typical web application request/response cycle? Or is there something else behind it? In what situations is Rails not a client? For the situations where it isn't... how common is that setup, and which situations could Angular not cover?
(There's a few, but I suspect they're vanishingly rare.)
You may not have been doing this long enough to remember, but there was a time when "front-end" meant something different than it does now, reasonably so, and there's still some circles where people use it to invoke the layer of an application that generates HTML in back of an HTTP server because of what it's in front of.
> Equating Rails data store backend to Angular's service backend is a poor analogy.
In both cases, you have a given runtime opening up a socket connection through which some protocol exchange negotiates a request-response sequence to a server which provides some representation of application data. What, specifically, is the "poor analogy"?
Sure, everybody understands that the runtime in question is different and execution takes place in different environments. And like I said in my earlier comment, you can make distinctions between types of data stores (though given how common JSON responses are these days, most database developers seem to be working really hard to minimize those distinctions). You can make (some) distinctions between the demands of a multi-request environment and a browser's (likely) single user environment. It doesn't change the fact that there's a reasonable level of consideration at which Angular and Rails are doing the same thing: mapping view actions to controllers, running control code to update/retrieve model state including hitting data sources over the network, and updating the view. An analog doesn't have to be perfectly isomorphic in order to be useful or true.
Now if you really want enlightenment, ask yourself what problems the application of those patterns is meant to solve, and how effective Rails and Angular are at solving them. Also, what parts of the system are "in front" of the browser?
Or don't, I guess, and insist that labels like front-end and back-end represent disparate iron clad categories of platonic computing ideals with an insuperable gulf betwixt.
You're touching on my main beef which is that a service that you run for others on your servers requires a massively different mindset from a client-side app that you distribute to run on others' hardware. Do they use a common flow control pattern? Sure, but the performance and security requirements mean you have to think about them completely differently. I'd consider an iOS app is more akin to an Angular app than a Rails app is.
Does that mean I'm trying to put a stake in the ground on the canonical definitions of front-end vs backend? Hell no, it's totally relative, I'm only commenting within the context of Angular vs Rails so don't read too much into it.
LOL, where to start? 1) There is such a thing as a web app which doesn't run Javascript. 2) Rails can generate and serve JS. 3) Rails was the first web framework with AJAX integration back in 2004 via PrototypeJS, this is the library that kicked off the entire movement that gave us the modern JS world we have today.
People used to write websites and web apps long before Angular and single page apps came about. I remember a big part of the beginning of my career it was perfectly standard to render HTML server side. All the templating would be done server side and you'd serve dynamic HTML to browser. With usually some small JavaScripts in HTML for client side.
Even today not everybody uses SPAs. Lots of big old school websites generated dynamically on server. Many Django and Rails projects built that way. Also lots of legacy Java stuff and PHP websites (all Wordpress and Drupal sites out there for example, and there are millions).
A Rails app is literally one or more Ruby processes whose task is to respond to HTTP requests/WebSocket connections. Yes, these processes will probably make use of other services (e.g. a database server).
An Angular app is a JavaScript file that runs in a client's JS environment.
I don't see how there's considerable overlap here?
Rails was designed at a time before angular brought the SPA into fashion. SPAs manage state and template rendering on the front-end, so much of the Rails machinery becomes unnecessary. You can still use Rails to implement your JSON API backend, but the simplicity of something like Sinatra is often preferable since you're increasing the complexity on the front.
The node+react+webpack combo gave web devs the power to process application state and render the UI on the front-end and the back-end using the same code, a feat that is not possible at all with Rails, pushing it a little further out of fashion.
Of course, the SPA suffers plenty of criticism, and Rails is still a popular choice with and without an SPA on the front-end, but I think it's fair to say that Rails is no longer the "fun" way to do things (though the most fun doesn't mean the best business or engineering solution).
I agree with all of the major points of your comment, but I have a quibble:
> Rails was designed at a time before angular brought the SPA into fashion
I dunno that Angular is the relevant point here. Rails (late 2005? I think?) post-dates the public existence of Gmail (public April 2004), a wildly popular SPA. It also post-dates the existence of Google Maps (early 2005). The famous blog post by Jesse James Garrett that coined the acronym "AJAX" is also from early 2005.
The thing that Rails does well was already largely-obsolete before Rails was open-sourced. (Though that is one state-of-the-art buggy whip! Very impressive!)
I absolutely agree that the SPA was already a thing before angular, in fact, angular was specifically designed to wrangle the general design concerns related to building SPAs and doing so sold the distinct concept of the SPA to many developers.
If you want something comparable (feature wise) to Rails, then check out Hanami. It's much better designed for large apps. If you are looking for a micro framework, I'd suggest Roda. I'd characterize those two as the modern replacements for Rails/Sinatra.
For some reason, I could never get the hang of Rails. But when I found Sinatra, it all made perfect sense to me.
Now I exclusively use Padrino [0] as the framework to build all my web apps. It is a framework on top of Sinatra that gives you a lot of the Rails goodness, but allows you to be really agnostic in terms of ORMs, Testing/Mock frameworks, rendering platform etc.
I've done RoR stuff and something about "doing-things-the-rails-way" bugs me to no end. After being paid to work on a RoR app I started to hate Ruby. Some time after I wrote something small and silly for my thesis (to visualize some data and give my classmates something to play with) in Sinatra and it rekindled my love for the language.
Something I can't really put my finger on just rubs me the wrong way in Rails, but using Sinatra felt like shattering chains that were holding me back. Maybe it was partly that I was learning NodeJS at the time, I dunno, but I can't recommend Sinatra enough. That being said, I have zero experience with Sinatra in a production environment, so take this anecdote with a shovel(s) of salt.
Similar here. I've used Ruby since 2005, but mostly for backend code, with a sprinkling of Sinatra for my web projects. Sinatra is spartan compared to Rails, sure, but it also means you get to bring just the components you need, and Sinatra is tiny enough to just read the code if you don't understand exactly what's going on. In fact, before Sinatra, I had a couple of projects where I used Camping - _why's micro-framework for the same reason: It brought just the bare minimum, and you could hold all of what it did in your head with ease.
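For anyone who hasn't seen it, a complete Sinatra app really can be this small (route made up):

    require 'sinatra'

    get '/hello/:name' do
      "Hello, #{params[:name]}!"
    end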
I don't quite have the same hate for Rails, I think. If a client asks for it, I'd consider it. But I also usually tell them I'd rather refer them elsewhere.
I am American, and I remember the first time I heard of Ruby was in '99/2000-ish. I didn't get into it but it sounded interesting. It was presented as a Smalltalk-like language with good interop with C, or even Obj-C for OS X apps. Pretty similar to the niches that Python fills.
Then it seemed like one day some years later, a very faddish, I will be blunt, charlatan type seemed to dominate that discussion. This is the type that confuses ruby with rails, git with github, the internet with the web. They are using ruby in the same way that the old ASP crowd would have used Visual Basic.
I was never involved in any of this, just watching from the sidelines, but it does seem like a bit of a shame.
Ruby definitely owes a big part of its popularity and awareness to Rails. I think all Ruby committers admit this fact.
That said, Rails is hardly the only framework that has brought Ruby to communities outside of Japan. Chef/Puppet/Fluentd played a key role in bringing Ruby to the ops world, while books like "Understanding Computation" by Tom Stuart demonstrate Ruby's potential beyond web programming.
There's a tangible tension between Rails fanatics and sans-Rails Rubyists, but they have needed each other to get where they are today.
Even in Japan, PHP and Java are more popular than Ruby. As per a Japanese headhunter friend of mine who works for a big placement agency there, there are about 3 times more PHP jobs and 6 times more Java jobs than Ruby jobs in Japan.