Intel's attempt to make a play in the maker market with its x86 cores seemed attractive to many software developers. Partnering with Arduino was a good move for usability, but the boards they were competing against were players like the BeagleBoard and the Raspberry Pi. Realistically they could aim for third place, but unseating the ARM boards proved too difficult.
Intel IP just doesn't offer anything in that space. Who cares about x86 compatibility when you're writing new firmware? It's a horrorshow compared to what ARM provides.
I don't need AVX extensions or whatever; I need sane and plentiful interrupts, a reasonable memory model, simple I/O, etc.
Was Intel ever seriously in the maker market? As a maker, I feel it was kind of an afterthought for them. The Edison looked great on paper, but the price was wrong, the tooling was inadequate, and the whole attempt felt like an enterprise trying to cargo-cult its way into some market share.
It never made sense for Intel to enter this market to begin with. Opportunities that move the needle for a company like Intel come in sizes of hundreds of millions of dollars or more. The maker market is just not that big. "IoT" however is - from industrial to home automation and everything in between, where sensors, controllers and networks get added to traditionally analog or un-networked technology.
If you look at what Intel was selling, that's the market that Intel's products were actually geared toward. Look at the price, specs and form factor of the Edison. Now imagine that Intel was really trying to push into the IoT space: it's a good play, considering they failed to break into mobile. These products were really competing against companies like Phytec, Variscite, Compulab, Toradex, Myir and Olimex, who make ARM-based modules for embedded networked products.
Why they pushed so hard to market to the maker community instead of commercial IoT is sort of beyond me. Marketing to makers helps gain mindshare and familiarity with your product (but x86 is already ubiquitous). For comparison, I wonder how much Atmel's bottom line has been affected by Arduino, or Broadcom's by the RPi. Why Intel couldn't get their act together to make a well-supported, compelling product for commercial IoT applications is baffling. That's where they really failed.
I feel like Atmel and Broadcom will see more of an uptick in sales as makers finish college and get jobs doing actual engineering. If you've been building things with Atmel chips in your spare time, they'll be familiar, you'll probably reach for them first when designing something for a job, and one product using a chip can lead to millions sold. Even if only a small handful of engineers converted to their stuff instead of PIC/STM/TI/etc. due to the Maker movement, it's still potentially a huge payout. And considering that they didn't even pay to create Arduino, it's even better!
I sense this from AT&T as well. At IoT World they had a booth with "Cellular enabled IoT Devices," but they couldn't even explain to me how I put code on a device, or how I could scale a device up to production.
Yep, 8-bit 8MHz microcontrollers already offer plenty of power for a lot of hobbyist applications. And if not, there are plenty of super cheap (<$5) boards with 32-bit ARM processors, or WiFi/Bluetooth built in.
Once you make your 'x86' incompatible enough that it won't actually run glibc fully, you've negated the whole point of having x86 in the first place.
Intel's "strategy" with Quark etc. has been to sell you on the idea of a ubiquitous platform, and then provide a niche platform that looks "mostly similar" to the ubiquity you were promised.
As someone who spent a bit of time trying to start a project based on the Edison, the most frustrating thing was the number of bugs and the lack of documentation that would have let me do anything about them myself, leaving me hopelessly waiting for Intel to release fixes.
For example, after several months of terribly slow and buggy SPI and no fix over multiple releases, I finally switched to ARM and am very glad I did. Intel did finally fix the SPI issue about 9 months after it was first reported.
With ARM, I had plenty of issues and challenges, but had the documentation and resources I needed to be able to fix things, as well as a better support community.
One of the key issues in the Intel support communities was a growing lack of trust, now confirmed by Intel dropping out. It takes a big commitment to really understand a system, and the nice thing about ARM is that the community goes beyond a single company, so a company dropping out is not as significant as in this case with Intel.
For what it's worth, I think bad or lacking documentation is an Intel trademark. Even projects that should have stellar documentation, like Intel TBB or MKL, have fairly cryptic docs.
They somehow think the "Intel Inside" marketing strategy is what worked (rather than the Wintel monopoly just happening), and that they can "Intel Inside" their way into any market and somehow make Wintel-like margins, even in B2B markets that are fully commoditized, without needing to put out competitive parts or reduce adoption costs with things like documentation.
Indeed. I attempted to do some node.js prototyping (since it was one of the directly-supported environments, supposedly) -- but they never pushed any sort of update, such that one was stuck on a (by node standards) ancient version.
I'm not very surprised by this. Quite a long time ago I came across their Galileo boards, but at that price point it just did not seem worth it: 2-4x the price of a Raspberry Pi or an Uno.
The appeal/benefit of having x86 was just never there for me or the agency I was at. I could see how it might be useful if you're writing a lot of assembly, and low-level assembly at that, but that seemed too much of a niche for the way they were marketed. If it had networking and I could use it with Python, I was happy.
Also, with Arduino you can easily and cheaply buy more ATmega328 DIPs and put them on protoboard for an embedded project.
I built a little server rack temperature monitor (which collects data from a couple sensors and provides a simple REST service over ethernet) that way, all you need to run the ATmega328 by itself is about 50 cents worth of components.
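For flavor, the whole thing fits in one small sketch. This is just a rough reconstruction of the idea using the stock Arduino Ethernet library - the TMP36 on A0 and the MAC/IP values are placeholders of mine, not the actual build:

```
#include <SPI.h>
#include <Ethernet.h>

// Minimal rack-temperature monitor serving a REST-ish endpoint.
// Assumes an analog TMP36 sensor on A0; MAC/IP are placeholders.
byte mac[] = {0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED};
IPAddress ip(192, 168, 1, 50);
EthernetServer server(80);

float readTempC() {
  // TMP36: 10 mV/degC with a 500 mV offset at 0 degC.
  float mv = analogRead(A0) * (5000.0 / 1023.0);
  return (mv - 500.0) / 10.0;
}

void setup() {
  Ethernet.begin(mac, ip);
  server.begin();
}

void loop() {
  EthernetClient client = server.available();
  if (!client) return;

  // Consume the request; we serve the same resource regardless of path.
  while (client.connected() && client.available()) client.read();

  client.println("HTTP/1.1 200 OK");
  client.println("Content-Type: application/json");
  client.println("Connection: close");
  client.println();
  client.print("{\"temp_c\": ");
  client.print(readTempC());
  client.println("}");
  delay(1);
  client.stop();
}
```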
x86 has two things going for it: Microsoft Windows x86/AMD64 compatibility, and the highest possible performance (not performance per watt; absolute attainable performance). If you don't need either, it is expensive in cost, power, or both. And in the IoT/maker world, one rarely needs computing power on the device itself (and the Galileo doesn't really provide a lot of computing power) or wants to pay for a Windows license per device (and deal with activation, etc.).
I work on IoT projects professionally, and we see a large gap between maker-grade hardware and production-quality systems. Every week we talk with startups and enterprise customers who start off on something like an Arduino, only to hit a wall because there is no good way to take it to production.
IMHO, the market is ripe for a hardware/software platform that bridges the ease of Arduino with a path to production. A bunch of the silicon vendors are in this space, but they offer weak solutions, and things like AWS IoT are really bad on the hardware side.
Provisioning more than one Ethernet PHY is an issue. There aren't a lot of choices for two or more PHYs at a reasonable cost (single dollar digits in medium quantities).
* Cost. Maker-grade stuff costs 10-100x what production embedded hardware does. You're leaving a lot of money on the table.
* There's no guarantee of long-term support; RPi and Arduino both regularly change form factors. They make no guarantees that they will continue manufacturing what you're already committed to.
* Maker vs. production system architectures are totally different. Linux is rare among production systems. It just needs too much hardware (see Cost). So you have to rebuild the thing anyway when cost-reduction time comes.
'Production' means different things to different people, of course. I'm thinking of a world where you want to ship 10k+ units of a device. If you're only shipping 100 units, your concerns are very different.
Big one for me is shape. A lot of the stuff that I work on has to fit in pressure cases that are already pretty cramped. That means that there is no room for a bunch of breakout boards and whatnot.
The Arduino itself might cost 10-100x of something that could go into production, but couldn't you prototype with an Arduino and use cheap Atmel chips with the bootloader installed in production?
You absolutely could. But many, many startups suck at this step, and get stuck at the "systems integration" phase, able to connect purchased components like Arduinos to other purchased components with wires, but unable to transition to purchasing the raw parts and connecting them using printed circuit boards.
I'm also pretty confused here. Are the "startups" in the ancestor threads more like college hackathon projects? Like, I put a Raspberry Pi in my toaster and know nothing else about embedded systems; now how can I sell a million of these?
Yes, that tends to be how startups come off - even the "sophisticated" ones. They tend to have minimal knowledge of the steps necessary to turn their prototype into a mass produced thing: DFM/DFF, supply chain management, identifying/negotiating with component/bare board suppliers, etc. Generally speaking, they need lots of hand holding through the entire process, and that makes it take way longer for everyone.
The nature of stateside electronics manufacturing doesn't help startups much in that regard, in that shops are kinda either set up for NPI and rapid prototyping or not. Sierra Circuits is good for that, but idk how well they'd fit with a startup budget. Beyond that, the low-complexity nature of most IoT products means they're more cost effectively manufactured in China, as most US shops focus on low to mid volume runs of high complexity boards, as opposed to high volume low complexity runs. Figuring out how to manufacture in China can be a big obstacle for any company, especially smaller ones.
The difference is that in software, you can go very far into production with an AWS account and a few blog posts on devops. There is no need to hire a specialist in wiring server racks or negotiating traffic peering deals up front.
All the IoT boards have made prototyping hardware accessible to "software people", but unlike AWS there is no smooth scaling curve from 10 to 10k to 10M customers.
Yes, but I'd like to add to others' points that unless you get your hardware architecture juuuuust right, you might start getting bugs in your code as you make the transfer. Furthermore, there are difficulties if you misjudge your production run, need to have another go, and find out that some dumb little module is no longer available in exactly that form factor, forcing you to use a slightly different one and causing another fracture in your code base. Now you're supporting two, and developing on a third, etc. I've never experienced this myself, but it's come up at IoT conferences as I talked with folks.
Some Arduino IDE-compatible boards are pretty small, like the Teensy[1] line - though if you needed to cram other boards in with one it'd still be cozy I imagine.
In the case of the Teensy you've pretty much got just a chip with breakout (and maybe an FTDI chip) - it's pretty much a reference design. At that point you'd just board-down that entire layout; there's little benefit to buying a module with a mini-USB connector that might or might not be in a usable position and high-pitch headers that take up a lot of space.
But if you start off on something like an Arduino, you'd "just" convert it to a custom board running an AVR (or ARM with the newer ones?) on the way to production, or are those that bad to use in serious designs?
It's more a problem if you start on something big like Artik/RPi/Edison and try to downsize later. You'll find yourself locked in to that vendor's software stack.
Or if you try to ship 10k devices with whole RPis inside.
These are mostly true, but it's pretty trivial to design your own Atmel AVR board.
> Linux is rare among production systems.
AFAIK, Arduino has no OS, let alone Linux. It's a giant wrapper around "setup(); while (true) loop();", which is itself a gigantic problem for power savings. Fortunately there exist alternative platforms, and some even run on the same Arduino hardware: https://github.com/jmattsson/tinyos-arduino
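That wrapper isn't an exaggeration, either. Lightly trimmed (USB and variant init omitted), the AVR Arduino core's main() is just this busy loop, which is why a naive sketch never lets the MCU sleep:

```
// Simplified from the AVR Arduino core's main.cpp: the whole
// "framework" is a thin busy-loop wrapper around your sketch.
int main(void) {
  init();     // core timer/ADC setup
  setup();    // your code, once
  for (;;) {
    loop();   // your code, forever, at full power
    if (serialEventRun) serialEventRun();
  }
  return 0;
}
```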
Personally, I like FreeRTOS. I got an ESP8266 and started writing programs right away. There's no VMM and no true multitasking, but it's better than raw Arduino. I managed to port Zork in an afternoon.
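For anyone who hasn't tried it, the FreeRTOS model replaces the super-loop with independent tasks. A minimal sketch, assuming the ESP8266 RTOS SDK's header layout and app_main() entry point; the task bodies are placeholders:

```
// Two tasks instead of one super-loop. Header paths vary by port;
// this follows the ESP8266 RTOS SDK convention.
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

static void sensor_task(void *arg) {
  (void)arg;
  for (;;) {
    // ... read sensors here ...
    vTaskDelay(pdMS_TO_TICKS(100));   // yields the CPU, unlike a busy delay()
  }
}

static void network_task(void *arg) {
  (void)arg;
  for (;;) {
    // ... push readings upstream here ...
    vTaskDelay(pdMS_TO_TICKS(1000));
  }
}

void app_main(void) {
  // xTaskCreate(fn, name, stack_depth, arg, priority, handle_out)
  xTaskCreate(sensor_task,  "sensor",  1024, NULL, 2, NULL);
  xTaskCreate(network_task, "network", 2048, NULL, 1, NULL);
  // The scheduler preempts between the tasks from here on.
}
```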
An Arduino is a microcontroller with a few conveniences added. There are plenty of microcontrollers inside of devices that last for years on tiny coin cell batteries. You can program it to go into an ultra low power sleep mode and wake the instant you press a button or on a predefined schedule, do whatever business you want it to do for a few moments, then go back into ultra low power sleep. Some microcontrollers are more optimized for this use case than others, but plenty of ordinary ones are and they've gotten incredibly good at it.
You can use the standard AVR sleep modes and interrupts on Arduino. Put the uC to sleep and have it wait for a pin to change or a timer to elapse before waking instead of idling at full power. See the Arduino docs (or the Atmel datasheets, which are easy to read):
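A minimal power-down sketch in that vein - the wake pin (INT0/pin 2 on an ATmega328-based board) is my choice, not from the docs:

```
// Classic AVR power-down pattern: sleep until pin 2 goes low,
// handle the event, sleep again. Microamps instead of milliamps.
#include <avr/sleep.h>

const int WAKE_PIN = 2;   // INT0 on an ATmega328-based board

void wakeISR() {
  // Nothing to do; waking the CPU is the whole point.
}

void setup() {
  pinMode(WAKE_PIN, INPUT_PULLUP);
}

void loop() {
  set_sleep_mode(SLEEP_MODE_PWR_DOWN);   // deepest sleep mode
  noInterrupts();
  sleep_enable();
  attachInterrupt(digitalPinToInterrupt(WAKE_PIN), wakeISR, LOW);
  interrupts();
  sleep_cpu();                           // zzz... until the pin goes low

  // Execution resumes here after the interrupt fires.
  sleep_disable();
  detachInterrupt(digitalPinToInterrupt(WAKE_PIN));
  // ... do the actual work, then fall through and sleep again ...
}
```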
There are libraries that do this automatically on Arduino too, allowing you to schedule [cooperative] multitasking and sleep the uC between tasks. E.g.
is really good, I've used it before. You basically queue up a list of task callbacks and a schedule in your `setup()` and then call `tasks.execute()` in `loop()`, which pops off the next task that is due in a queue or sleeps otherwise. It's simple, but much more straightforward than manually using `if millis() - last > delta1... else sleep()`, and not as rigid as using the timer ISRs (which really serve a different purpose).
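To make the comparison concrete, this is the manual millis() bookkeeping such a library abstracts away - intervals and task bodies here are arbitrary examples:

```
// The manual version of cooperative scheduling: track a timestamp
// per task and compare against millis() on every pass through loop().
unsigned long lastBlink = 0, lastSample = 0;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - lastBlink >= 500) {
    lastBlink = now;
    digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN));
  }
  if (now - lastSample >= 2000) {
    lastSample = now;
    int reading = analogRead(A0);   // stand-in for real sampling work
    (void)reading;
  }
  // A scheduler library would additionally sleep the MCU here until
  // the next task is due, instead of spinning through loop().
}
```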
On more complex platforms you can also use an RTOS, which is kind of like a more beefed up version of this model. Actually you can do this on AVR too, but I haven't ever seen anybody actually use FreeRTOS/ChibiOS/whatever on AVR.
Bear in mind that while what he said is correct, there are other concerns on an assembled board, as opposed to the bare chip. The Arduino has a voltage regulator that is always consuming current. The power LED is always consuming current. You should be sure to set pin 13 (the indicator LED) as an output and turn it off, since even as an input there will be leakage current through the LED. The other active components on the board have their own quiescent current levels, etc.
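The software side of that advice is only a few lines - roughly this, with the peripheral shutdowns valid only if you genuinely don't use them (the regulator and power LED are hardware problems software can't fix):

```
// Drive the pin-13 LED off and shut down unused peripherals.
#include <avr/power.h>

void setup() {
  pinMode(13, OUTPUT);
  digitalWrite(13, LOW);    // avoid leakage through the indicator LED

  power_adc_disable();      // only if you never analogRead()
  power_spi_disable();      // only if SPI is unused
  power_twi_disable();      // only if I2C is unused
}

void loop() {
  // ... sleep-heavy application code ...
}
```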
You can get an AVR device to run on a coin cell for months or years (I have a design that does just that). You'd have to modify an Arduino significantly to do the same.
Very true, though there are some better Arduino variants. The Pro Mini is the canonical "low power" Arduino, as it uses a very efficient LDO regulator. I also use bare ATtinys for a lot of things; they don't require any passives and can run directly from a LiPo as long as you use a protected cell or connect a feedback line to the ADC.
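A sketch of the "feedback line to the ADC" idea - the 2:1 divider, 3.3V reference and cutoff voltage are assumptions, not a recommendation:

```
// A resistor divider from the cell to A1 lets firmware warn or shut
// down before the LiPo over-discharges. Note: with Vcc as the ADC
// reference this drifts as the cell sags; the internal 1.1V bandgap
// is the more robust reference.
const float DIVIDER = 2.0;    // e.g. two equal resistors, cell to GND
const float VREF    = 3.3;    // ADC reference (here: the supply rail)
const float CUTOFF  = 3.4;    // conservative LiPo low-voltage floor

float batteryVolts() {
  return analogRead(A1) * (VREF / 1023.0) * DIVIDER;
}

void setup() {}

void loop() {
  if (batteryVolts() < CUTOFF) {
    // disable the load / enter permanent deep sleep here
  }
  // ... normal duty cycle ...
  delay(1000);
}
```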
A lot of maker stuff is either bog-standard parts that you can buy in quantity off AliExpress, or open source with all designs and layouts freely available. Software's open source also.
Using a whole RPi 3 as part of a solution? Yes, I agree there's a supply risk; but if you're making 10k devices, you're talking about a raw cost of at least 300k USD, right? So spending 30k on a few months of engineering time to make it production ready (idea: make sure the OrangePi knockoffs also work, so you have two sources of supply) is feasible in the case you mention.
Linux SoC based designs are fairly complex in terms of engineering cost - if you're only building 10k of something, you might in fact be better off using something like the RPi Zero or Compute Module (or another system-on-module, e.g. from Variscite) rather than trying to integrate the Broadcom part yourself. Not to mention that RPi is getting much better prices on things like the SoC and the HDI PCB than a startup producing 10k of something is likely to get, so it may even be a wash on unit cost. And the supply risk is likely manageable; 10k is a high enough volume that it would be foolish to rely on distributor stock/spot prices for components, so you're already worrying about lead times; I don't see why a wholesale contract with RPi would be that much riskier. (Yes, it's true that the lead time would increase a bit since RPi needs to purchase components themselves and convert them into modules.)
Now if your product is Arduino based (or another microcontroller platform) the barriers to doing a fully custom design are a lot lower, the markups on the Arduino board are a lot higher, and so the cutoff volume for where it makes sense to do a custom design is going to be much lower than 10k.
I'm a little confused here too; obviously an arduino-inna-box is not a proper solution, and for a production run you should design your own board with an ATmega328 on it, so what's unsuitable about the ATmega?
Indeed, there are a few companies that can help you with this[0][1], even bigger ones like TI. It is somewhat expected that you are driving this, though. There are plenty of resources on the internet about every step to production.
Thank you for the links. I work in this space professionally. I had heard about Brillo and Android Things a while back (there was very little info back then). Looks like there is a lot more good info now. Will delve into it. By the way, if you are in Mountain View and want to grab coffee, please reach out (email in HN profile).
I am very curious to see how this pans out - I badly want something like this to succeed. It would be so much better for someone like me who works in this domain. Google has the weight to pull this off, and it looks like the SoM approach might be the right way forward (Particle and Electric Imp are good examples of managed hardware+software).
Having said that, there's a bigger market for microcontrollers IMO - which probably doesn't fit into this strategy at all. There used to be Weave (RIP - which also makes one cautious about Android Things), and there's Zephyr (kinda similar, but not really), but no clear winners here yet.
IMO, the biggest problem is software startups' unwillingness to hire other types of engineers.
It's easy to bridge from Arduino to Atmel AVR IF you know how to do board layout. Startups need to either grow their knowledge of electronics or hire some electrical engineers.
The AVR documentation is excellent, you could easily design your own board if you have anything beyond rudimentary electronics skills.
A lot of IoT startups are dependent on attachment "blades" for interfaces. They get burned because a lot of their value proposition and profit is tied up in some other guy's blade they don't know how to build.
Unwillingness to hire is related to cost, which is a supply/demand thing. There are many more software engineers around. Layout of high-frequency electronics has many gotchas, and for devices with wireless communication you need to get your design certified too. Getting high-volume discounts as a startup is also hard.
I think system-on-module approaches like Samsung's or Android Things' solve those problems, though.
Arduinos powered by AVRs aren't high-frequency. The highest-frequency external component might be a 20MHz crystal, which you can lay out just by following the example design in the datasheet.
Most ARMs up to even a few hundred MHz are also pretty easy to lay out.
For wireless you're right, but this is more or less solved by the recent proliferation of cheap plug-in wireless modules that are already certified.
Using a module or not, you still have to go through unintentional radiator testing, though, right (if you have a clock signal on the board over some ludicrously low frequency)?
I really don't get the point of Android things. Looking at their website the most constrained device they support is on the order of a raspberry pi. Why wouldn't you just go with Linux/Raspbian in that scenario instead of locking yourself into Google?
There are many factors. To start, many companies were already choosing to run a fork of Android in their products, like the Amazon Echo. This gives you access to Android's APIs, developer tools, and knowledge base, which is a significant boost in productivity. By using Android Things specifically, you also get hassle-free OTA updates directly from Google, which is essential for security.
Raspbian is pretty much only supported on Raspberry Pis, so I think the worry of lock-in is worse there; and it's significant, taking into account that it's harder to go from prototype to production with a Raspberry Pi than with a SoM-based board.
The UserDriver system is another factor: Android already does sensor fusion, so if your product uses, say, a GPS receiver, you can hook it up to the OS with a short amount of user-space Java code, and all existing Android code that uses the location APIs now works with your GPS. No need to modify client code nor compile anything into the OS. And your UserDriver works on any Android Things system.
The talks from this year's Google I/O explain these kinds of things in more detail.
>Layout of high-frequency electronics has many gotchas
I listened to this mantra for about 15 years, but like anything else that "looks hard," it becomes obvious after doing it.
RF design rules are well documented and pretty straightforward once you get past the smoke, mirrors and doom-mongering. All the good RF subsystem manufacturers have white papers, dev boards and fully documented layout design guides for their chips, and low-power ISM subsystems under 20dBm are more or less bulletproof.
I have made several commercial designs; my first ones were just copy/pasted gerbers off the dev boards (I did not even need to buy the dev boards, as the gerbers were downloadable for free). All of these worked fine, and even several dBm of transmission loss due to bad RF or enclosure design is not actually a game changer in most short-distance/low-power applications.
In my first designs I actually understood very little; now I have all the toys to do proper RF design and understand it much better, and as long as you read up on the basics there is no reason not to try it yourself. Seriously, it costs $15 for a PCB delivered worldwide these days, so you can afford to experiment - or tell your EE/intern to do it, and don't be too surprised when it works.
It can be very hard to hire good embedded developers (I know, we are trying). Also, until you reach a certain scale, the workload tends to be highly variable. What these places need to do is hire one of the many good design houses that can quickly knock out a production-worthy design from an Arduino prototype and put you in contact with reliable contract manufacturers to crank out volume.
My employer has an EE/embedded team of 3, and we still rely on a local design house to rev our PCBs, do the generic infrastructure components of the embedded s/w, and design test fixtures. We write specs for the electronics and keep the "secret sauce" firmware in house. There are other benefits -- as I write this, we have an expensive instrument that we could never justify purchasing sitting on our lab bench, because we borrowed it from our design house. We can get a few hours of specialist time here and there for things like FCC compliance testing to augment our in-house expertise.
One person who has been around the track enough laps to be able to write good specs and do program management is sufficient in the early days.
I can do embedded, but also do other programming and programming related jobs.
The thing I've found with embedded is that the pay is generally not as good as other work, and it's generally not as flexible (no remote). If you can do either generic webdev or other things, those often look like better options.
The remote part is hard because typically for embedded work you need at least a minimal EE workbench. Yes, the pay isn't as good for some reason -- which is odd given how hard it is to find good embedded people. Perhaps embedded developers as a group are poor negotiators.
Personally, I couldn't stand doing generic webdev -- I'd rather spend eight hours a day poking myself in the eye with a sharp stick. I've always lived on the EE/software boundary. I suppose staring at logic analyzer traces is some other person's sharp stick, but it works for me.
I much prefer lower-level programming, preferably programming hardware I designed. I know the hardware limitations, side effects, and overall the how-to behind the hardware that allows the software to do its thing. Most software engineers I know have no desire to understand the underlying mechanics (they're certainly capable of it - they just don't care).
I'm currently trapped in a "devops full stack" role with much of the tooling being NIH. My routine day is a mix of keeping the trains rolling on time and getting horrid stuff like "my $database_root_password = <password>" out of SVN...
I keep applying to positions in my wider local area (Midwest). But so far, only headhunters that want to waste my time with endless "pre-interviews". Tomorrow, it's back to tickets, heigh-ho!
I'd guess most people willing to do embedded dev remote would have their own workbench.
It doesn't really cost that much and I wouldn't see it as a limiting factor.
But it's kind of moot because the work isn't really available.
That's great that you'd rather poke yourself in the eye with a sharp stick than do web dev. But I have skilled embedded dev friends writing php because the alternative is working at a supermarket (they're not location flexible).
At the risk of going offtopic a bit, do you guys have some advice on where to start with a basic home EE workbench? I've been playing with simple Arduino circuits/servos/simple control and would like to improve my workflow a bit to make it more efficient. Some things I have:
- soldering iron and supplies
- "helping hands"
- misc parts/supplies like wire, breadboards, switches, LEDs etc.
- batteries, USB chargers, couple of PC power supplies for power supply
- a simple 2x1 0.1-inch header convention with red and black wires for plugging in DC power

Desk is a basic folding table from Staples. I would like to figure out where to find a better table/desk (I assume with outlets etc.) and what other things I don't know about that you'd recommend.
Having been in this exact role (turn a prototype into a product as a contractor), I have no end of love for my R&S entry level MSO. 2 channels analog, 8 channels digital, and it's just a pleasure to use compared to the cheaper scopes. I had originally ordered a Rigol, but my supplier couldn't get any units for 6-8 weeks (backorder) and they had the R&S in stock. It cost 2-3x as much, but... time crunch. Couldn't be happier!
While I don't own one, the TS100 [0] is supposed to be an amazing soldering iron. [1] As a plus, the hardware and software are open source. [2]
> "helping hands"
Realacc six arm third hand. [3] It's fantastic for the price.
> misc parts/supplies like wire, breadboards, switches, LEDs etc.
AliExpress? YMMV for quality though
> batteries, USB chargers, couple of PC power supplies for power supply
Can't recommend batteries or USB chargers from China, too much counterfeit junk. For PC power supplies, Seasonic is a well priced, quality brand. There are a bunch of ATX power switches available (e.g. AntMiner) on eBay for a few dollars.
My personal recommendations, mostly stuff I've got:
Soldering iron: I like the Hakko FX-888D. $90-110 or so. They have better if you can afford it, but that one's very good. The FX-951 is the next step up, and can take micro-soldering handpieces and has the quick-change tips. It's about $240.
Get a chisel tip, e.g. the Hakko T18-S3, a bevel tip (T18-S6), and a bent-conical tip (T18-BR02). The conical tip is perfect for lots of general purpose work; you can use the fine point or the sides of the bend. The back of the bend can be used for drag soldering, and the inside of the bend makes soldering wires together easy. The chisel tip is good for soldering things with more thermal mass (PCB-mount heatsinks), and the bevel tip is pretty necessary for drag soldering on QFP and similar surface mount packages.
Hot air station: Probably something cheap from china, there aren't any particularly affordable name-brand ones that I know of. Weller has the WHA900 for around $600.
Magnification: Get at least one of the magnifying headsets ($8-10 on Amazon) and a desk magnifier with LED ring light. Better option is an AmScope stereo microscope, such as the SM-4NTP and a ring light for it like the LED-144W-ZK. $480 total.
PCB vise: I have an Aven 17010, it works pretty well. MUCH better than trying to hold a board in the helping hands.
Flux: Get liquid flux with a syringe. Amtech is the best, but there is a lot of counterfeit stuff out there, and Amtech doesn't sell it directly (bulk orders only). https://mailin.repair/amtech-nc-559-v2-30-cc-16160.html sells the real flux.
Tweezers. Any ESD safe set.
Fume extractor: VERY important for health. You do NOT want to be breathing in flux fumes. A high-volume HEPA air purifier on the desk works ($150 or so), or a dedicated device like the Hakko FA430 is even better ($625).
Oscilloscope: Rigol DS1054-Z. 50MHz, hackable to 100MHz bandwidth easily. $400. There's no better cheap scope at the moment (IMO).
Function generator: Siglent SDG805. $270. Needed to give you analog signal inputs. Part of the big-3 of 'scope, power supply, and function gen.
Power supply: Get a linear supply. The Tekpower TP3005D-3 is $200, and is an actual linear power supply. The knobs are coarse adjust only (it's analog), I replaced the control potentiometers with 10-turn versions which substantially improved the accuracy of the output. There's also the Siglent SPD3303X-E ($340) if you want a digital panel version. You definitely need arbitrary +- voltages for lots of very basic circuits, PC power supplies are very limiting and too noisy if you do any sensitive analog design.
Multimeter: Get a safe one (HRC fuses, proper transient voltage suppression, etc.) Can't go wrong with Fluke, of course, but Extech, Brymen, and some others have cheap and capable handheld meters. $100-300, depending on brand. Be sure it has a micro-amp range! The really cheap ones don't, and you WILL need it if designing embedded stuff.
Logic Analyzer: Get a LogicPort. pctestinstruments.com. They're $390, for a 34-channel 500MHz device, very nice for the money. Needed if doing much digital work. (Keysight's 34-channel standalone analyzer is $12165 base price. 5GHz, but still, twelve grand...)
Spectrum Analyzer: If you're doing RF work (radio design), you'll need one. Otherwise skip it. The Siglent SSA3021X with tracking generator add-on is $1764 (pretty cheap) and quite capable (9kHz to 2GHz). It's also hackable / software upgradeable into the 3.5GHz model. The Rigol DSA815-TG is $1550, but significantly worse (smaller display, worse resolution bandwidth, max 1.5GHz, etc).
Be sure to get a GFCI outlet and a GFCI adapter or two. The oscilloscope, function gen, spectrum analyzer, etc, all are mains earth referenced, and should each have their own GFCI plug. If you accidentally connect the ground lead of any of them to something other than ground the GFCI will trip and prevent the ground traces from being blown up inside the device. They're about $20 each, well worth it IMO.
You might want an anti-static mat and wrist-strap.
Get a bunch of small drawers, eg https://www.amazon.com/gp/product/B000LDH3JC. Print labels for them, use them to store resistors, capacitors, and other types. You can fit two values of component in each drawer (though they don't come with enough dividers :/). You want at least 96 drawers for resistors and 32 for capacitors assuming you're buying 1% or 5% resistors and 10-20% capacitors (pretty normal). I bought a kit of resistors (https://www.amazon.com/gp/product/B017L9GKGK) and (https://www.amazon.com/Joe-Knows-Electronics-Value-Capacitor...) for capacitors (Joe Knows Electronics kits are good for stocking up, they have more of the most common components in their kits.)
Get some desoldering wick and a solder sucker too. Also some tip tinner, and/or a sal ammoniac block. Make sure you have a roll of kapton tape to hold parts down while you solder them (it survives high temperatures). If you'll be doing a lot of surface mount you'll want a reflow oven and solder paste.
EDIT: One tip I forgot, very important: When you buy parts (on DigiKey/Mouser or similar) make sure you buy extras. At least the number needed for the first volume discount or 10, whichever you can afford. 3x the number needed for the project at the minimum. You WILL drop parts, burn them out, and otherwise damage them. It's much easier if you already have spares, don't have to wait for shipping, and don't have to pay for shipping. This will also help you develop a parts library, as you do more projects you'll be likely to re-use common parts and already have many left over from past work.
Great list! I too am an embedded engineer and use the stuff you mentioned in my office.
The most important thing for me is the scope because I develop bare metal firmware (includes drivers etc) for both SoCs and microcontrollers. I use a Rigol 100MHz scope and I really like it. Of course I could not go for the higher speed scopes because of the budget.
Also, for some of the instruments I have found that Tek and Keysight provide refurbished stuff at much lower prices. Have you tried any refurbished gear, and what is your opinion of it?
For the logic analyzer, I found it a waste as a separate instrument. I would prefer to have the analyzer in the scope itself, all in one, so that I can do analysis on a single screen.
I've not tried refurbished, though I assume it would be fine. Tek and Keysight make great gear.
The advantage of the PC-based logic analyzer is in the ease of use, mostly in setup. Also in the number of channels, the ones built into scopes tend to be 16 channels. The two-screen thing is a bit of an issue, it would be better if Rigol's PC software for controlling the scope was halfway decent. Tek's software is much better.
As a hobbyist trying not to spend too much money, I ended up getting the following amateur-level equipment:
- Digilent Analog Discovery 2: Low-end 30MHz oscilloscope+function generator+logic analyzer controlled via USB. $279 (or $179 with student discount) I think of it as 'swiss army knife' of electronics: it's not as good as a real oscilloscope, function generator, or logic analyzer, but it does the basics of all those roles and fits in your pocket. http://store.digilentinc.com/analog-discovery-2-100msps-usb-...
- Xprotolab Plain: Same general idea as the Analog Discovery, but only $20 and therefore an order of magnitude crappier. Only 20kHz bandwidth. Suitable for an absolute beginner on a tight budget. http://www.gabotronics.com/oscilloscopes/xprotolab-plain.htm
- Extech EX330 multimeter: It's a multimeter, the most basic instrument you'll need. $53 on Amazon. I recommend getting some alligator clip probes; I find them much more useful than the pointy probes it comes with. https://www.amazon.com/gp/product/B000EX0AE4
I would look for used equipment if that's at all an option.
A "real" scope is a workhorse that lasts forever with the features and the convenience you don't get from those cheaper options. You also want real scope probes. In my area I see someone selling a TDS220 for $200 which would be a good buy. Also an older 4 channel 100MHz Analog scope...
One of the reasons I chose a USB scope is for the portability. Real scopes are too big to stuff in a backpack. I agree the calculation would be different without that.
I haven't gone far enough in electronics to run into limitations of the above equipment; I figure I'll upgrade when I need to.
- Look for a bench-top power supply with a current limiter. You can save yourself from frying your new board with the current limiter.
- I'd get a used good scope. Personal preference would be a digital Tektronix. A logic analyzer can probably wait.
- A digital multimeter. My preference is Fluke. You'll want a buzzing continuity tester in whatever you get.
- You might want to invest/experiment in some method for home manufacturing of PCBs. Breadboards are a pain ;) We used to do wire-wrap back in the day, probably no one does that any more :)
- I find that you can do a lot with two hands. You figure out all sorts of clever ways to hold your work, wires and the soldering iron ;)
- Heat shrink tubing.
- Crimping tool(s) and various connectors that are convenient for what you're working on. It's a bit of an investment but you want to connect stuff easily and reliably.
- One of those magnifiers on an arm with a light built in...
This is a great starter list, and I agree with most of it. A multimeter with an audible continuity checker is an absolute must. It gets used more than any other tool. For bonus points, get a multimeter that can take a thermocouple probe. Not a necessity, but very nice when you need it.
I disagree about home-made PCBs. It's just too much fuss and mess, and there are a lot of places that can give you cheap 4- or 5-day boards in small quantities. AP Circuits is usually a good deal for my hobby stuff, but I always price out others, too.
One word for PCB beginners: DO NOT obsess over making every board perfect. It won't be. For one-off hobby quantities, an X-Acto knife and some wire will fix most basic mistakes in a few minutes. PCB design is like golf: low scores are better, and holes-in-one are rare. Too often beginners fear fabbing a board because they think it is cast in stone. Mistakes just aren't that costly at low volume.
I think the pay isn't as good because the profit margins aren't as good as "the cloud" so there isn't as much money available for engineers. And because the slower pace of development makes it seem like you accomplish less per year.
My biggest problem with web development is the users. I've made a few internal websites to manage some of our stuff at work, which only have to deal with a small number of users (100 or so) who are all technologically competent engineers, and I still find it incredibly annoying dealing with all the weird ways they use my very simple webpages. Sure, hardware can have its quirks, but they're not like people.
I'd bet that the reason web/mobile dev pays much better than embedded is that there are many more firms doing it. Embedded means that your company is producing a hardware electronics product, which requires a specialized core competency that relatively few firms have. By contrast, web & mobile dev just means that your employer has some information they need to deliver to users, which describes basically all of the millions of businesses in the world. More potential employers = higher demand = higher wages, even if the supply of webdevs is bigger and becoming one requires less training.
(Personally, I found the low-level hardware/OS/algorithms courses fascinating in my CS degree, but went into webdev because it paid more and gave me more career options, and then some data science because the combination of UI skills + data wrangling skills means I can actually build useful things on my own. No regrets on my career path, but if I get lucky and cash out on a startup, I think I'd love to do embedded & hardware stuff.)
>> "It can be very hard to hire good embedded developers (I know, we are trying)"
...
>> "Yes, the pay isn't as good for some reason"
Don't these seem related to anyone?
I'm not an expert on business or anything, but maybe the reason companies are having trouble finding people to work embedded jobs is they aren't willing to pay them enough?
>> "Perhaps embedded developers as a group are poor negotiators."
I don't know of a better negotiation tactic than just going to do something else you can also do that pays better (a lot of people I know with deep embedded skills are doing higher level mobile app work these days for exactly this reason) because the industry has some weird preconceived notion of what you are worth...? "Fuck you, pay me".
No, companies making things needing embedded stuff are usually trying to shave off pennies wherever they can. Which means the good embedded devs take jobs elsewhere that actually pay.
Yeah, I miss having a scope on my desk. But my pay is nice enough that I can afford my own EE workbench at home, should I want one.
Remote is the only way as far as I'm concerned - I have a lot more and better equipment than anything my customers have in-house, and my main line of business is helping customers who have an idea and want to turn it into a prototype but have little or no in-house electronics knowledge. And I agree with you that webdev is a cancer of high-speed growth and churn, with (from what I can tell) very little benefit of framework-of-the-week over framework-of-last-week. Maybe it will stabilize one day.
I too have a decent lab at home and would like to get into work similar to yours. Would you mind elaborating on how you get business? From what I have seen, most customers prefer to give their business to trusted partners and vendors for firmware work, and getting in is very tough.
It is extremely difficult. My main sources are word-of-mouth references and follow-up projects from existing customers. My 3D-printing tool related activity has also brought me some business, and I use my electronics workshops as a marketing tool as well. Still, I have quite a lot of difficulty getting projects, largely because the intersection of (has money to pay for external development), (knows what they actually want) and (too small to have in-house expertise) is fairly small. Where are you based?
I am based out of India. Here, quite a lot of projects go to big service/outsourcing companies for some reason. Many of the projects fail, but I have seen their customers still send projects there. The pattern I see nowadays is that most embedded companies prefer to have a vendor who has compliances and processes, however stupid they may be and whether they really follow them or not. It is, as they say, just to cover their behind.
Another lead that I see is Chinese SoC manufacturers who would like to have Linux/Android brought up. I see a lot of them produce hardware but have very little supporting software. The problem is I do not have any idea how to approach them, let alone convince them to spend money on development.
The big companies (medical and automotive) in Germany generally give work to their vendors, but again, they give it to German companies and very rarely to outsiders. The contracts are very lucrative; I have seen one-person companies there providing very average embedded software components. Again, it's very hard to crack the market unless you know someone there.
Yeah, I'm an embedded engineer and honestly the pay isn't that good and the job opportunities just aren't there. I'm seriously thinking of switching over to backend web development.
It's what I did a long time ago. Well -- before the web, it was Windows system/application development. But yeah. In university, I enjoyed compiler design, embedded work and video-game development much more than anything else. Turns out the demand for compiler writers is tiny, the pay for embedded developers is sub-par, and the game industry is just awful. So boring business apps it was.
Turns out to have been a decent career. In the end, programming is programming, and I have enjoyed most of it. There are always problems to be solved.
I'm in the same place. I have better equipment than some of the companies I have worked for, so I wish remote positions were available. I'm even open to burst-y work, so maybe I should be a consultant.
Yep...it's been like that for a while. Undergrad in computer hardware engineering, masters in EE. Done plenty of embedded work in school and internships and thoroughly enjoyed it.
Web dev/backend dev for the last 10 years.
I often wonder who, in the long run, will be the ones to push the boundaries when you can get very good pay doing some generic software engineering job (similarly, I have friends who went several years into a STEM PhD only to drop out and go into a CS master's program).
They all have their own specialties and pros/cons, but most of them deal with small startups to help bridge the gaps. As always YMMV so due diligence is required.
We used them to design and build a custom board for our Sumo robots that had our V1 built from an Arduino board and a custom board that plugged into the Arduino. We couldn't really afford to pay them a big chunk up front, but we negotiated a pretty pleasant royalty deal that comes out to a few dollars per board (which we get manufactured for less than $10/board).
They did excellent work and I highly recommend them for that type of "prototype -> production" path.
IMO hardware manufacturers should get their shit together.
It is unbelievable that they are not able to provide a decent hardware development platform that can match Arduino in ease of use & documentation, that uses decent production-grade components, that's supported for sale for >10 years, and that has a clear path to switch from dev board to your own IC.
I don't think it's a question of the state of a Silicon manufacturer's $hit. Nothing about what you propose is technically complex. It just takes a large, long term investment to build and support a platform like that for >10 years. The manufacturers probably haven't done it because they don't see a big enough market yet. Most big consumer hardware co's can just develop their own solutions internally, and hardware startups are still incredibly niche.
The complexity of developing professional firmware is one of the main tools chip companies have against their customers switching chips, so they won't invest a lot of money explicitly against this goal.
> and that has a clear path to switch from dev board to own IC.
The core problem why many startups get stuck with Raspberry Pi, Arduino and friends is exactly the "dev board" problem.
When building an MVP, if I have the choice between a $20 Pi/Arduino/Pi Compute Module and a $1,000+ dev board, hell, I'll choose the Pi option. There's lots more support, especially because any combo of I/O and a Pi has been tried by someone else before, in contrast to $weird_sensor + $weird_niche_devboard. And you will want about 10 or more units so you can afford to blow a couple of boards. This will happen inevitably during development, either by "fat-fingering" +12V into a 3V3 input or by blowing the wrong eFuse, and better to lose $20 than $1k - not to mention you have to raise the 10k first...
The problem, I think, is that there's aggressive cost-cutting by large players who have experienced teams of EEs working on the projects and are willing to spend a lot on NRE costs if it means they can save pennies on each piece. This does not create a friendly market for self-taught start-up hackers.
I think that's what AVR is; there is a clear (and reasonably straightforward) path from Arduino to production on a custom board. I'm a total hack/n00b and my first AVR PCB worked.
I suspect that STM and TI are of similar difficulty, but I've only seriously played with Atmel because of the Arduino dev board (and related ecosystem).
Ten-year availability of the exact same part is not guaranteed, but there's a clear enough lifecycle policy, and if you're paying any attention you'll have a chance to make a final "lifetime" buy. (If you have the luxury problem of selling so much product that a lifetime buy is impractical, the NRE for a redesign is probably manageable for your business.)
The documentation for STM and TI is far, far below the standard of Atmel's, both officially and unofficially. It's more difficult to find information, there's a much smaller development community for beginners and the boards start in TQFP.
That last part alone stops most people from playing with ARM, because you're almost forced to get boards made. At least with AVR/PIC you can prototype most of the Atmegas on a breadboard. Obviously there's a limit to what you can do with an AVR, but you can do a lot with 20MHz.
This may be an ARM thing though. I found it much more difficult to find development documentation for Atmel's XMega platform, I didn't even look at the SAM chips.
TI's website is a rabbit hole though. Sometimes the datasheet is enough, other times you have to go to their weird Wiki which looks unfinished. Sometimes it's available for free, other times you have to log in to get the information. STM isn't much better.
It's a crying shame. ARM is more capable and often cheaper and lower power than going the 8-bit route, but it's a pain in the arse to get started.
In terms of layout though, there isn't much in it. There are datasheets from ST that tell you what the mandatory hookups/passives are. Everything else is more or less identical to any other microcontroller, though you may need to worry about speed.
You can always get an MSP430 LaunchPad for $9.99, which comes with an MSP430G2x in a DIP package. You can breadboard the chip if you want, or run it on the board itself. The individual MSP430G2x parts are also pretty cheap, so if you blow one up you can always replace the chip.
> The documentation for STM and TI is far, far below the standard of Atmel's, both officially and unofficially. It's more difficult to find information, there's a much smaller development community for beginners and the boards start in TQFP.
> That last part alone stops most people from playing with ARM, because you're almost forced to get boards made. At least with AVR/PIC you can prototype most of the Atmegas on a breadboard. Obviously there's a limit to what you can do with an AVR, but you can do a lot with 20MHz.
Isn't that why we have development/prototyping boards? You can always develop and experiment with your code on one while developing your PCB in parallel. That way, when there are problems with your custom board, you can at least be sure of your code.
In my experience TI documentation is excellent, but their software is sometimes over-engineered, especially TI-RTOS.
Personally I have worked on the STM32F series and I found the documentation good. Also, if you want to read through the internals of the ARM architecture, you will have to refer to the ARM manuals on the ARM website.
>>IMO, the biggest problem is software startups unwillingness to hire other types of engineers.
>>It's easy to bridge from Arduino to Atmel AVR IF you know how to do board layout. Startups need to either grow their knowledge of electronics or hire some electrical engineers.
I agree; hardware can't be agile the way software is, so you have to redesign your product from the prototype (like an Arduino).
What's more, you should outsource your hardware design to a professional hardware company, such as a design house.
> It's easy to bridge from Arduino to Atmel AVR IF you know how to do board layout.
Design is half the battle: the other half is component selection and manufacturing. If your BOM is way off, you're SOL before you even start, and most people don't get this until it's too late.
This is why you don't design something around an esoteric part, like the only ARM part with 5 Quad-SPI ports or a dual-time-base RTC. If you do, have contingencies for your contingencies, like merging footprints so you have the option to use different IMUs on the same PCB.
These maker boards are for prototyping. If you want to go into production you should make your own board. There's a large step between writing software and making your own PCB though. You would be a truly full stack developer.
The biggest problem is that hardware is a sucker's game. The minute you have to start creating even moderately specialized PCBs for your product, you incur a ton of extra costs.
You have to deal with yields from the fab process, a hardware testing/debugging process that often requires an expensive oscilloscope, an up front outlay of capital just to get the pcbs produced, you have to deal with getting it certified as being 'safe'.
> It's easy to bridge from Arduino to Atmel AVR IF you know how to do board layout.
> Startups need to either grow their knowledge of electronics or hire some electrical engineers.
> The AVR documentation is excellent, you could easily design your own board if you have anything beyond rudimentary electronics skills.
If you're doing something very simple, maybe... but if you're doing something very simple, why do you need specialized hardware? Get something prebuilt that runs embedded C or Linux, write your software, attach your controllers (build a nice case), and be done with it.
If you're doing something more complicated (multiple layers, PCIe, etc.), you'll never get the yields you need (to be profitable) out of your fab process without either a very skilled/experienced EE or a team and a bunch of money. Even with a simpler (or no) fab process, you still have to worry about defects in production and testing for those defects before you ship the item. But at least without a fab process it can be arranged to be someone else's problem when the widgets don't work.
It's not that you're wrong, it's just that doing your own manufacturing is either:
a.) Adding a lot of expense to something that needn't be as expensive if you can buy something that already pretty much works in bulk. If you reach the state of mass production, it could make sense to do this yourself, but at that point you may be past the startup/proof of concept phase.
b.) Necessary but very expensive (more expensive than it appears on the surface) and problematic to both your margins and cash on hand. If you go this route you better have some backers with extremely deep pockets who believe in you and are willing to throw in extra cash when the first run of your board has issues and you get a low yield on them.
I agree, however, about the value prop issue.
But that's also kind of why I think hardware is a sucker's game. Either you get screwed by having to make your own stuff, or you get screwed by being dependent on a third party who may not be reliable (or in business, or still producing the thing that you need). Or both, because you're likely getting it fabricated by a third party, which will lead to the same issues as purchasing something 'off the shelf,' plus the possibility of having no one to blame but yourself.
source: did a hardware startup.
edit: I forgot to mention one other factor... If you're doing something high performance, there's a possibility that by the time you're ready to ship the product, it is out of date and there's some faster next-gen hardware out that will do the job better. This is exactly what happened to AMD with Bulldozer (there were some other fuck-ups there too, but for the most part it was superseded by Intel's more advanced fab process).
> IMO, the biggest problem is software startups unwillingness to hire other types of engineers.
As someone who co-founded (and later sold) a small privately funded company doing embedded software development about 15 years ago, I think I can give some insight. We were a 3-person startup, all with a software background. About all we knew was how to use a multimeter and solder a JTAG or DB9 connector, which we also needed to do on a regular basis - but that was the extent of our knowledge.
One of our very early projects, however, required some custom hardware. So we started looking for electrical engineers, and quickly found out we were absolutely clueless about how to interview or evaluate these people. With one guy we interviewed it 'clicked' - and after talking to him, we quickly realized we knew nothing about hardware design, production, and everything else involved. We would have hired him, but he was very honest about thinking that would be a bad idea and rejected our offer. Looking back, he was absolutely right. You don't just need electrical engineers, you need people with experience in production, hardware testing, and following up on all those things.
We ended up outsourcing the hardware design and production to another company, and actually recommended them the guy we had found, who ended up doing most of the design for our project. Stick to what you know best; if you're small and need hardware designed, try to find a company that can do this and believes in what you're trying to achieve. It's easy to lose focus when you suddenly have to learn a bunch of new things - which includes failing a lot, something you can't afford in a startup.
IMHO I can't imagine a technology that is closer to production than the Arduino; it gets you 90% of the way there and, more importantly, teaches you the limits of the technology before the design goes to an electronics prototyping house.
Prototypes are easy. Production is hard. This is the current issue with the Tesla Model 3: building a few is easy, but a full production line, with all its perils, is a massively harder proposition.
Many years ago we worked on releasing an Arduino compatible design.
To do so we had to:
1) Release the board files - fine.
2) Open source the code - ok.
3) Pay the foundation 20% of our retail profit - ~record scratch~ - not going to happen.
There is not enough margin in retail to justify asking for a $40 license per chip on a $5 hardware part. The Arduino-compatible initiative was dropped and we built on bare metal instead.
Other maker boards have some leverage here (Raspberry Pi, Next Thing Co, Electric Imp) because they control an exclusive chip supply. Arduino, being a clone of the Wiring SDK plus an off-the-shelf chip on a breakout board, never got to exercise its position effectively in the enterprise / mass-market consumer electronics space.
Analogously, the Apple ][ started as a development platform for hobbyists and became a business workhorse with VisiCalc. Arduino never left the hobbyist space, and scared off a lot of legitimately interested businesses with its ambiguous licensing terms and the Genuino debacle.
The previous poster's point is that if you can cram your problem into the limited resources of an Arduino, with its small ATmega CPU and very limited memory, it's not a big jump to a custom board for your low-cost product. Anyone can buy an ATmega328, the low-end Arduino processor: $1.78 from DigiKey, $1.27 at Seeed. No royalties.
Isn't this more of a lack-of-education problem? There are plenty of maker-grade boards out there that are easier to turn into a manufactured product. Arduino is particularly bad, almost an outlier.
An ESP32 has about as much CPU juice as an Amiga 500 or PCW 1512 (minus the graphics), plus built-in network capabilities, so personally I think it's well suited to lots of tasks, given what we used those machines for back in the day.
This is literally the exact niche solution that attracted me to my current company (Electric Imp). I'm still fairly new, but the general pitch is that it's easy to get started and easy to scale to whatever level you need, theoretically up to millions of devices for a single product (and certainly that many across the spread of customers).
I used Electric Imp in college for an electronics project -- glad to hear it's still kicking! It was a great platform, minus the custom Squirrel language (a Python or Lua library would be more familiar to most people).
Hah, that's exactly what I brought up in the interview - why Squirrel?
I've talked to our CEO and the original core team about this a couple of times. In their minds, Lua, while more widely adopted seven years ago, was also less well supported and more volatile than Squirrel. Squirrel got the job done, was pretty straightforward, and was small enough to get going with. And once the decision was made, customers were already compiling it into live projects... Well, at least there haven't been growing pains as a result; it still gets the job done and we have metric tons of libraries for it.
We can put abstraction upon abstraction in software, because it's weightless. We have containers and platforms that make it very easy for a hobbyist to create something production quality.
It's not so easy with hardware. If you don't understand your stack fully, things can get bulky and expensive fast. I don't know if there really is a good solution here.
The only thing that exists today is white-glove services. If you have enough money, there are plenty of engineering firms that will take your prototype and build you a production version.
There's Seeed Studio, in Shenzhen. They make PCBs and assemble them. If you design using only parts in their Open Parts Library [1], they offer quite fast turnaround. They have ARM and ATmega CPU parts, and they can also source from DigiKey and Mouser. They have a Design for Manufacturing manual with lots of information on tolerances.[2] They'll hand-build one-off prototypes for you. Upload your Gerbers and BOM, and they'll give you a quick quote.
Of course, all this assumes you can design a working board. It's not inherently difficult, but there's a considerable learning curve. There are lots of online resources, and lots of board-design packages. I use KiCAD, which is open source, but Eagle is probably more suitable for pro work. (KiCAD is a good package, but the component footprint libraries are not that complete and can be somewhat off.)
(I've used them for blank boards. They do a nice job, but it's not super-fast. All their boards have been good, although once, by mistake, I got boards intended for someone in Japan, and they got my boards. Seeed re-made the boards and shipped again.)
For one-off boards bigger than a few square inches I like 4PCB.com: $33 plus shipping for a 2-layer board, 5-day turnaround, up to 60 square inches at that fixed price, minimum quantity 1. For contrast, Seeed charges $4.90/sq inch for a 2-layer board on a 5-day turn, minimum quantity 5. 4PCB's 4-layer boards are $66, though "only" up to 30 square inches, and they have a 1-day-turnaround bare-bones prototyping service: those boards have no solder mask, but the price is credited toward your production order, so it ends up being a fast PCB prototyping service that costs only shipping. They're in the US too, so shipping is faster than from the Chinese companies.
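Putting rough numbers on that (a sketch only - it uses the prices quoted above, ignores shipping, and assumes the per-square-inch rate is the whole cost of the order):

    # Break-even between 4PCB's flat 2-layer price and Seeed's
    # per-area 2-layer price, both on a 5-day turn.
    PCB4_FLAT = 33.00       # $33 flat, up to 60 sq in, min qty 1
    SEEED_PER_SQIN = 4.90   # $4.90 per square inch, min qty 5

    breakeven = PCB4_FLAT / SEEED_PER_SQIN
    print(f"4PCB wins above ~{breakeven:.1f} sq in")  # ~6.7 sq in

which lines up with the "bigger than a few square inches" rule of thumb above.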
There are tons of board houses. What's unusual about Seeed is that they will do one-off assembly for a reasonable price, provided you design to their standards.
True - I just mentioned 4PCB because they've given me very good service in my experience (as a hobbyist/student). I've also used Seeed; they're quite nice, especially for small boards.
This reminded me of pcbshopper.com - it's a pretty neat comparison service for PCB houses. The data isn't always perfectly accurate though, especially because it doesn't take sales or other special pricing into account.
> If you don't understand your stack fully, things can get bulky and expensive fast.
Ironically, software fundamentally suffers from the same problem. It just so happens that today's hardware is so powerful relative to most use cases that most developers don't run into those problems until they try to scale (or they scale successfully, but as a result of that success now have the money and incentive to start caring about formerly tiny problems).
There are some companies out there trying this, but then you're locked into a single hardware solution for your gateway. That tends to meet with a lot of resistance.
Pretty interesting perspective - maybe check out the Hologram Dash, which is programmed with the Arduino IDE and is fully compatible with Arduino functions. It includes a cell modem as well: https://hologram.io/dash/
You should probably look at our stuff - electricimp.com
There are over a million commercial devices on our platform, which is essentially a mass-production platform that's excessively well documented, tested, scalable, and free to prototype with. It's been on the market for 5 years now.
Critically, we take security very seriously and maintain the security stack on every device for its lifetime - whether or not the product's owner has cycles to spend on security, we keep it up to date. We're also the first, and currently the only, platform to be UL 2900-2-2 certified - yes, an arbitrary standard, but it's all sensible stuff.
It doesn't look like an Arduino, but that architecture is not well suited to IoT in our collective opinion, and individual customers with hundreds of thousands of devices on our platform agree.
A silicon vendor's interest generally dies off once they have a design win...
What's wrong with something like the Raspberry Pi? I work with it professionally (see my profile) and have been very successful using it in "production". Right now I don't need to build a custom board, but if I ever decide to take that path, there's the Compute Module, which was designed for exactly that. It is successfully used by NEC and at least one other display vendor with a dedicated Compute Module slot. The Pi is also still backwards compatible with even the first released Pi. Unlike other platforms, it's slow-moving, and you have some guarantee that your investment isn't wasted once some new shiny "board of the week" appears.
Depends on your definition of production. Sourcing all the components for 10,000 units and building a BOM at a cost the business will accept is a typical kind of production I see - not something you can do with a Pi.
* I/O is adequate if you're building a PC replacement, but the Pi doesn't have the right sort of I/O for embedded tasks (multiple SPI buses, multiple serial buses, and so on).
The RPi is a fine learning tool, but don't treat it as production-ready. I strongly recommend the BeagleBone Black instead.
That's not true in my experience. I'm running a digital-signage service based on the Pi, and so far all the devices have been very reliable. Of course that depends on other factors such as the SD card and power supply; if you don't try to save money on those, Pis run for many years without a single problem.
> CPU changes on every release
Some chip revisions changed (I'd have to look up the details), but they're all more or less compatible if you don't go too low-level. I have no problem running the same build of my software on all Pi revisions. Of course you can benefit from NEON or multiple threads on the later versions, but that's all optional.
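And if you ever do want to tell the boards apart at runtime - say, to switch on a NEON-optimized path on newer models - the revision code in /proc/cpuinfo is enough; no low-level poking required. A minimal sketch (the code-to-model lookup table is left out here):

    def pi_revision(path="/proc/cpuinfo"):
        """Return the Raspberry Pi board revision code, or None."""
        with open(path) as f:
            for line in f:
                if line.startswith("Revision"):
                    return line.split(":")[1].strip()
        return None

    print(pi_revision())  # e.g. "a02082" on a Pi 3 Model B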
Totally agree. For a WiFi device, the best options for bringing the "ease of Arduino" into production are Particle or Electric Imp. Great products, but high cost, especially for startups.
Have you looked at the mbed stuff? From what I've tried, and from their marketing, it looks like the wide variety of chips in that ecosystem covers the required spectrum for production.
I wonder if all those announcements by Intel are just a game: declare something big/interesting with tons of media coverage and the stock picks up by x%; then, a few years later, cancel it without fanfare and the stock goes down by much less than x%. Intel wins. For those knowledgeable about the stock market: could that be the case?
Because that would be a good explanation of why Intel entered this field, which is a very poor fit for it - unless they were to come up with some breakthrough.
The stock price is driven by what investors think the company is worth. For what you describe to work, investors would need to conclude that the investment Intel made here was profitable (via the sales made, for example), thus adding value to the company. At which point you couldn't really call it a "game", but a good investment that was now ended.
Don't lose sight of the forest for the trees: OP's scenario is a strategy in which the project's start (with investment) causes a rise in valuation, and its premature cancellation a smaller decrease in valuation.
Intel really seems to be stuck in its commodity PC/server mentality. The number of customizations done these days in mobile - and, I suspect as things get cheaper, in IoT - is pretty impressive. Intel hates customization, since it means other people make money (see the whole Nvidia Ion chipset saga). I just cannot see them making any headway with their current "must be a high-volume commodity product" attitude.
It's quite unfortunate, as they were a convenient middle ground between amateur boards and "we'll sell to you only if you order 500+ units" boards, and the brand was famous enough in less technical circles to help prototypes be taken seriously (given that the technical side is rarely what kills a project).
Well, that took a surprisingly long time to happen.
The maker community represents a trivial (at best) contribution to Intel's bottom line. Intel's bread has always been buttered by delivering high-performance, server-grade chips to people willing to pay for the cutting edge of performance - and every year, at that! That's pretty much the opposite of the maker market: people who build electronics for fun and aren't exactly flush with cash. I'd wager that the net profit from any one of Intel's enterprise customers vastly outstrips what they made on their entire maker line of chips and boards.
Why would they bother diluting their production focus and stretching their support engineers thin to court a bunch of penny-pinching HW hackers and garage IoT operations and address their concerns?
I always figured the hope was that some small percentage of the makers would develop an actual product with the platform, then continue to use the hardware as they scaled up to sales of thousands of devices.
I'm sure it was. Two years seems like a pretty good amount of time to test the market for a semiconductor company. I think we're just seeing Intel's leadership decide that the test didn't generate sufficient interest.
IBM used to make all of its money from mainframes, but over the years commodity computers got good enough to replace mainframes in most businesses. There are still some customers who need them, but the lion's share of the market is now dominated by commodity x86 servers.
Now x86 is in the process of being disrupted. Look at ARM, for example. Currently, most ARM chipsets are focused on mobile applications where low power is necessary. They are not yet powerful enough to compete with x86 in the server room (yes, I know there are ARM server parts, but it's still early days), but soon they will be for most customers. And at that point, the power/cost advantage will cause customers to switch over. This is already starting in laptops, with Microsoft bringing Windows to ARM and Apple rumored to follow. So I think it makes a lot of sense for Intel to try to address this threat, and competing with ARM/MIPS head-on for the IoT market is one way to do it.
IMO the mistake Intel made was trying to take their x86 business, which has a value chain designed to serve high-margin server markets, and shoehorn it into the IoT market, which is dominated by low-margin, ultra-cheap parts (think ESP8266).
> ...most ARM chipsets are focused on mobile applications where low power is necessary. They are not powerful enough to compete with x86 on the server... but soon they will be for most customers... I think it makes a lot of sense for Intel to try and address this concern, and competing with ARM/MIPS head on for the IoT market is one way to do this.
I agree with most of your reasoning - it's the strategic choice to go after the embedded market that I think was unwise. The makerspace move was Intel trying to compete in a whole different sector from their core competency (embedded devices vs. server chips). A more sensible strategic move, in my opinion, would have been to optimize the power consumption of their previous-generation server-grade chips. That lets them sell through their existing channels (which they're very good at) while segmenting on the customers who care about power consumption. It's all of the advantages of ARM, but it eliminates a lot of risk for the customer by being x86 and legacy-compatible. Plus, it gives the sales guys the option to say, "Well, if you don't care about power and do want maximum performance, we can always discuss our cutting-edge line of server cores..."
> it's the strategic choice to go after the embedded market that I think was unwise
I think the way they entered the market was unwise, not the decision itself. For example, Intel could have created an independent org within itself that could use Intel's resources but wasn't tied to legacy processes.
One could argue they never actually entered the maker market. I can't name one prominent project that used any Intel maker-oriented device. That's a monumental marketing/product-development failure: offer overpriced products that don't really do anything (much) new, then completely fail to get people interested in them. No one here cares about your instruction set, so you can't ride on that alone.
I saw it coming when they discontinued Joule and Galileo.
I actually liked the Curie chip - plenty of goodies on one die.
From Bluetooth LE to a battery charger to an accelerometer and gyro, plus hardware acceleration for k-nearest-neighbor classification (cool for gesture recognition). All that on a small-form-factor, low-energy die.
Plus, I _feel_ safer using a Curie for IoT than just using a Raspberry Pi and never updating the Linux distro.
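For anyone who hasn't run into it, the k-NN bit is simpler than it sounds: the Curie's pattern-matching engine does (roughly) in silicon what this little sketch does in software. The gesture names and feature vectors here are made up for illustration:

    import math

    # Tiny labeled training set: feature vectors derived from
    # accelerometer/gyro readings, with a gesture label each.
    training = [
        ([0.9, 0.1, 0.2], "shake"),
        ([0.1, 0.8, 0.1], "tilt"),
        ([0.2, 0.2, 0.9], "tap"),
    ]

    def classify(sample, k=1):
        # Distance to every stored vector; majority vote among the k nearest.
        dists = sorted((math.dist(sample, vec), label) for vec, label in training)
        votes = [label for _, label in dists[:k]]
        return max(set(votes), key=votes.count)

    print(classify([0.85, 0.15, 0.25]))  # -> shake

The appeal of doing this in hardware is that matching stays fast and cheap within the Curie's power budget.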
Reminds me of a saying one of my startup mentors has repeated many times: "A great way to make a small fortune in home automation is to start with a large fortune." It seems Intel is stepping back rather than doubling down on the long game.
I think Intel realized that the "maker market" is the craft/toy/hobby market, not the embedded-systems market. Intel is also dealing with threats to its core businesses, so better to pull back from a marginal market that couldn't even fully support what were the two leading chip companies in the space (Atmel & Microchip) and focus on not losing share to ARM elsewhere.
Not the first time. Intel got into mobile devices circa 2000-2001, then suddenly abandoned the area.
I worked on a secure (authenticated, encrypted) port-forwarding proxy for mobile devices at the time. Our company partnered with Intel to bring the software to their new mobile devices. We were quite far along, with working demos and all. Then one fine day, word came down from the higher levels at Intel that they were pulling out.
The Intel team we collaborated with was split up and scattered across Intel, and that was that.
It was bad for us because we had put resources into it and were counting on cash that never materialized, and on top of that the dot-com bust was in full downward swing.