Open Software A-Z CAM

Discussion in 'Interfaces' started by Rob Taylor, Sep 9, 2014.

  1. Rob Taylor

    Ok, ready for another WALLOFTEXT? :D

    I've been thinking about this for the last couple of weeks, and I'm not really sure why it wouldn't be doable. A sticking point I've seen frequently is with the Open Hardware circuitry we have going on. Not to say that isn't a valuable service- @openhardwarecoza is sending me one of his new laser boards soon, which I'm looking forward to- but it doesn't seem like it should be necessary given the range of corporate-backed (i.e. warrantied, returnable, professionally supported) open source boards we now have to choose from. Even just the main three- Arduino, RasPi and BeagleBoard, which all provide different solutions at different price points- seem theoretically sufficient to span the array of our CAM controller needs.

    Note that I'm not trying to say we should wholeheartedly replace one with the other, since I think that the only reason the corporate boards exist to begin with is thanks to the continued efforts of electronics guys working on open source options to simplify the tasks for the less technically-inclined. Arduino was made for artists, after all. And keeping them on their toes with continued development of other options is good for everyone. However...

    Even with the circuit layout and BOM provided to source locally, custom hardware (outside of a scale manufacturing situation, obviously) isn't necessarily the easiest way for beginners to jump in- you have to find a local or online custom PCB fabricator that ships and accepts small enough minimum orders, then do the soldering and testing, which a beginner may or may not have the equipment for or be any good at. It seems unfair to restrict use to either plug-and-play through $400 of USB controller and Mach3, or experimentation with Arduinos and soldering. There's a whole swathe of people- the majority, in reality- who aren't necessarily technically inclined- or who can put together flat-pack furniture but aren't exactly electronic engineers- and who don't have a fortune to spend. That's who I'm looking at. Until they can be targeted- and people like MakerBot are trying- the home 3D fabrication movement is going to stay low-key, depriving it of many people who could potentially be useful to the FOSS side of things.

    Software, on the other hand, can be infinitely, freely copied and pasted without any understanding of how it works (just look at some of my recent YouTube comments). It can be put onto the afore-mentioned mass-produced microcontroller/microPC boards which are reasonably priced, and the same custom control created for substantially less than $100. I believe, anyway.

    So, the question becomes, how is this software created? I don't know how many "software guys" there are on the forum, but I'm hoping that even those who aren't will be able to have some kind of input, certainly early on and during testing. My initial high-level flow concept is this:

    1) Take an STL or OBJ and slice it.
    2) Create toolpaths from those slices.
    3) Understand the difference between undercuts and overcuts regardless of the surface angle! (Calculus, I guess, from what I recall of second-year math lectures.)
    4) Convert the toolpaths to GCode.
    5) Write the GCode to memory (SD card for stand-alone units?).
    6) Access the GCode a line at a time and, if necessary:
    6a) Send it to the controller.
    7) Convert coordinates to a sequence of motor steps for all three axes.
    8) Determine frequencies (or pulse widths) to deliver steps per axis over time, based on the shape of the line being followed- constant for straight lines, equal for unity-proportional lines, quadratic for curves.
    9) Deliver pulses to the stepper controllers according to the predetermined parameters.
    10) Goto 6.
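
    To make the middle of that pipeline concrete, here's a rough sketch of steps 4-7 in Python- placeholder names and numbers throughout (STEPS_PER_MM, the F600 feed rate, etc. are made up), with slicing (1-3) and pulse timing (8-9) left out:

        # Rough sketch of steps 4-7: toolpath -> GCode -> per-axis step counts.
        # STEPS_PER_MM and the F600 feed rate are placeholder values.
        STEPS_PER_MM = 80

        def emit_gcode(toolpath):
            """Turn a list of (x, y, z) waypoints in mm into linear GCode moves."""
            lines = ["G21", "G90"]                 # millimetres, absolute positioning
            for x, y, z in toolpath:
                lines.append(f"G1 X{x:.3f} Y{y:.3f} Z{z:.3f} F600")
            return lines

        def gcode_to_steps(lines):
            """Read GCode a line at a time and yield step deltas per axis."""
            pos = {"X": 0.0, "Y": 0.0, "Z": 0.0}
            for line in lines:
                words = line.split()
                if not words or words[0] != "G1":
                    continue
                target = dict(pos)
                for w in words[1:]:
                    if w[0] in target:
                        target[w[0]] = float(w[1:])
                # millimetre difference -> whole motor steps per axis
                yield {ax: round((target[ax] - pos[ax]) * STEPS_PER_MM) for ax in pos}
                pos = target

        for move in gcode_to_steps(emit_gcode([(0, 0, 0), (10, 0, 0), (10, 10, -1)])):
            print(move)

    Steps 8 and 9 are where the hardware-specific timing lives, and that's the part I know least about.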

    This is compiled from what I've found over the last week or two. I don't think I've missed anything, though the research I've done into CNC controllers is fairly light, so it's altogether possible. That's where the hardware guys come in!

    Options:

    a) GCode- is it the best solution? It's the current industry standard, as far as I'm aware, but it seems outdated and bloated. Is it time for a new Open CNC standard?

    b) Other control: servos, motors, feedback, etc. This is designed to run steppers, but later versions should allow switchability.

    c) Acceleration. I keep hearing it's a thing, but I haven't quite determined exactly how so yet.

    d) Language- I'm thinking Python for ease of interfacing via Ethernet to a BeagleBone Black, which I hear supports simultaneous switching of GPIO pins and will run Android 4.0 or Ubuntu- software generally somewhat familiar even to the less technically inclined. It should be fairly straightforward to update or change to different ARM microPCs, or switch to, say, C++ as necessary.

    Is this just another FOSS reinvention of the wheel? Does LinuxCNC do all of this already? My issue with that is that it comes precompiled with Ubuntu LTS. I want something that can be downloaded onto a Windows PC, Mac or Linux box from a Git repo, with, say, an Android app to sideload onto a BeagleBone, and an Ethernet cable later it's ready to go. Minimal technicality, maximum flexibility.

    I'm not expecting this to all come together inside the next 3 months, but it seems like a valuable direction to go in. Thoughts?
     
  2. Rob Taylor

    Mods: Not sure whether this is best in interfaces (since it's designed as a hardware replacement) or CAM. Move as you see fit.
     
  3. Rob Taylor

    Oh- jog control and reference holes for multi-sided (or 4-axis) milling, of course.

    I need to look into how the model interpreter module would detect rotational symmetry- perhaps relevant parts are labelled as such within the 3D environment as part of setup? There may be an automated way already invented, I just need to find out.
     
  4. Tweakie

    Hi Rob,

    Certainly great food for thought - please keep the ideas coming :thumbsup:

    Tweakie.
     
  5. David the swarfer

    Some big leaps there. Good translation of Gcode commands to step or (especially) servo controls is why you pay 6 figures for a real CNC machine. This is hard to do well, and has been done very well with LinuxCNC and Mach3. The dedicated hardware solutions like Planet-CNC and other USB solutions will eventually catch up, IF there is enough horsepower in the chosen microprocessors.
    Yes, very definitely. It is a stable, standardized language, with plenty of scope for custom expansion.
    You can feed basic Gcode to a Haas or Fanuc (etc.) (such as is created using Sketchup and the Sketchucam plugin) and it will work just fine. Or you can use $100k+ software to create custom Gcode for one of those controllers and it will work just fine on the designated target, but not on the other.

    Just like C has been around for a long time and works on everything. C++ and C# are just extensions; delve a bit and their hearts are C (-:
    Already catered for by LinuxCNC natively, and by Mach3 using the correct drivers, like the ones from Gecko that take stepper signals in and output servo control voltages.
    Not just a thing, but the most important thing to get right in a CNC controller. Stepper motors will jam or just lose steps if it is done wrong. What is very important is SMOOTH acceleration: jitter in the step timing will mess with this and you will never get any useful speed out of the motors.
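    To make that concrete, here is a toy ramp in Python- purely an illustration of the shape of a smooth trapezoidal profile, not how LinuxCNC or Mach3 actually plan it, and the acceleration and speed numbers are made up:

        # Toy trapezoidal ramp: accelerate from rest, cruise, then decelerate,
        # and turn the speed at each step into the delay before the next pulse.
        # accel is in steps/s^2, v_max in steps/s; values are illustrative only.
        import math

        def step_delays(total_steps, accel=4000.0, v_max=2000.0):
            delays = []
            for i in range(total_steps):
                d_start = i + 1                  # steps travelled so far (avoids v = 0)
                d_end = total_steps - i          # steps left before the stop
                v = min(v_max,
                        math.sqrt(2.0 * accel * d_start),   # accelerating from rest
                        math.sqrt(2.0 * accel * d_end))     # decelerating into the stop
                delays.append(1.0 / v)           # seconds until the next step pulse
            return delays

        d = step_delays(400)
        print(f"first {d[0]*1000:.2f} ms, shortest {min(d)*1000:.2f} ms, last {d[-1]*1000:.2f} ms")

    Notice the delays shrink smoothly toward the middle of the move and stretch out again at the end- a sudden jump anywhere in that sequence is exactly the jitter that makes a stepper stall.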
    LinuxCNC is already being ported to the BeagleBone (-: To me this is a nice solution: a small footprint that can run a real Gcode processor without needing a dedicated screen and keyboard. Because LinuxCNC runs under Linux, it can already be run over the network through X sharing. This means that the 'program' runs on the host hardware (PC or BeagleBone) but the GUI appears on a remote PC. This has the added advantage that the host PC does not have to process the GUI stuff, so it runs faster, just in case you need that.

    Not sure about Python. It is interpreted at runtime and consequently slower than a mature compiled language like C. Can it keep up with the needs of realtime control without gaming-motherboard-class hardware? We'll see. I just don't have time to research this, sadly; could be fun.
    Pretty much a reinvention, yes. LinuxCNC is massively powerful in the stuff it can do. Mach3 works pretty well on a PC, but keep in mind that to do that they are doing some internal tricks to fool Windows into working in a predictable 'realtime' way. Good luck getting it to work on Windows 8/9/10 etc.!
    Referring back to acceleration: pulse timing is critical, and if the OS cannot support predictable thread timing, you cannot do it. This is built into the realtime Linux kernel, but not Windows, so they have to do some heavy trickery to make it work. I did a lot of semi-realtime programming under MS-DOS 6, years ago, and it was hell in there trying to get predictable interrupt timing.

    What LinuxCNC will need is either software or hardware replacement for the parallel port. However, you can still get PCI and PCIe parallel port cards and though not all of them work, some do and should be supported so we can enjoy them into the future.

    Maybe the BeagleBone Black is the future, or maybe an extension of it that has more IO and is realtime friendly. The front end to any CNC Gcode interpreter is just prettiness on top of some important hardware and software.
    If that software or hardware fails, people can get hurt. I've seen a multi-ton machine jumping around the floor because the microcontroller locked up; we had to pull the 3-phase power cable to stop it.

    As to portability, the only way you can do that is with external hardware that does the realtime stuff, and a GUI on your choice of Win/Mac/Linux. Java and Python can do the GUI end ok, but please have a real hardware emergency stop switch handy (-:

     
  6. Rob Taylor

    Thanks, @Tweakie. Will do. :)

    That's good to hear, because long before a project like this will be ready, I'll be using LinuxCNC. Servos are tricky, though I'm looking hard at linear feedback mechanisms over in the thread in Concepts and Ideas, which may help in those kinds of situations.

    That makes sense- there's always the latest and greatest thing, but everyone just goes back to the original after a while. Ok, GCode it is. It'll be more interpretable by people who are already into the 3D thing, too, keeping it open to both existing and new "members" of the small scale manufacturing movement.

    Yeah, I think any fully-realised software solution will have to be able to control anything thrown at it. The Mach3/Gecko (ie. crazy expensive) setup is what I'm working to circumvent for the next generation in terms of plug-and-play-ability. It's all that's available for the less-technically-inclined right now, but I see that as a problem that needs fixing. Not to say that they aren't excellent at what they do, but there has to be a better way.

    Interesting. I've done some experimenting with finding the "shortest pulse" that'll run a stepper, but what you're saying is that as the motor's speed increases, that "shortest pulse" actually decreases. That makes sense. Sounds like it's time for me to bust out the Arduino again, though I imagine the results will vary somewhat by specific motor model.

    That's good, though I don't see the BeagleBone as being sufficiently powerful to run an entire desktop OS plus 3D motion control applications. That's interesting to know about the LCNC networking though, I like that. I have plenty of computing horsepower available here in my office, but the machine will probably be downstairs in the basement shop. I'll look into that. I'm guessing that Mac should also be able to do some form of "X sharing" if it's a core function, though I'm on Windows so I can't test.

    Fortunately, I have that hardware! My workstation (well, sort of- it's only an i5) should be perfectly capable of keeping up with whatever I throw at it, though your point about requirements isn't meritless- plenty of people use dedicated media centre PCs or tablets, and that may be as much processing power as they have available. Maybe for a program of this size, I'd be better off going with something like C++?

    I figured. :p But I think of it as more than a reinvention, more of a democratization. Essentially, I'm trying to get the power of LinuxCNC- or some of the power, at least, onto every computing platform.

    Since the PC is really just a GCode generator- and perhaps interpreter, if your concerns about power=quality are true- I'm not too concerned about the computer's ability to send pulses. That's the job of the BeagleBone Black, and specifically why I chose that over Arduino or RasPi for this application. It's basically just accepting network-sent streams of commands and passing them on to the motors at the precise time. Or interpreting GCode, a line at a time, and sending that to the motors as appropriate. I don't know if Android's Linux kernel is capable of the same real-time operation, or if I'll have to try to get a lightweight Debian distro to play nice with it instead. That's something to look at.

    The overabundance of ancient technology makes me sad. :p Ethernet, I believe, is the future.

    Looks like the BBB hardware can be realtime as long as the software is, but there are patches available to make Linux so: https://groups.google.com/forum/#!topic/beagleboard/pKpEoNL0YtI - any comments on that discussion?

    Yep, hardware kill switch will absolutely be in place- on the front, operable by knee if necessary. My only concern is how to get it to operate both the 220V spindle VFD and the 120V PSU for the stepper drivers- preferably without cutting power to the controller board/computer. I'll have to determine that later on.
     
  7. David the swarfer

    No, I am not saying that at all. The way the hardware works is that it will step the motor to the next position on every low-to-high logic transition (or some will do high-to-low, and a few will only detect the high state itself). But the high or low state has to be maintained for a short while- the pulse length- before transitioning back, ready for the next pulse/step.
    It is the gap between pulses that sets the motor speed, and that is the pulse timing I spoke of that is so critical.
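    To put numbers on that (my own illustrative figures, not from any particular driver's datasheet), the pulse width stays fixed while the gap between pulses is what changes with speed:

        # Toy numbers: the STEP pulse width is constant, the low time between
        # pulses is what sets the motor speed. Figures are illustrative only.
        STEPS_PER_REV = 200 * 8          # 1.8-degree motor with 8x microstepping
        PULSE_WIDTH_US = 5               # how long the STEP line is held high

        def pulse_period_us(rpm):
            steps_per_second = STEPS_PER_REV * rpm / 60.0
            return 1_000_000.0 / steps_per_second   # rising edge to rising edge

        for rpm in (60, 300, 600):
            period = pulse_period_us(rpm)
            print(f"{rpm:4d} rpm: period {period:7.1f} us, "
                  f"high {PULSE_WIDTH_US} us, low {period - PULSE_WIDTH_US:7.1f} us")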
    But Ethernet struggles with realtime, which is why your Skype chat gets the stutters (-:
    I'll look at it after lunch...
     
  8. Rob Taylor

    Oh, ok. Yeah, that I know- my last project was also custom software, but that was Arduino-based and the timing wasn't so critical. Typically pulse edges were around 30-60 seconds apart!

    The timing being critical at least explains the smoothness of CNC machines- when this overall scheme first came to me, I'd thought about sort of "digitising" the paths and just driving each axis individually, one step at a time. Then I realised that would be horribly slow and inaccurate!

    Ugh, don't remind me! Hahah. I just need the Ethernet to be a fat pipe, I'm not too concerned with its realtime-ness. The realtime concerns only come into play once the step data is inside the BBB. However it gets there doesn't especially concern me.

    Cool. A couple of different Linux patches to "make it realtime" are mentioned; I wasn't sure how that meshed with your earlier comment that it already was. Though maybe you didn't mean a vanilla install.
     
  9. David the swarfer

  10. Rob Taylor

  11. Dawai

    I've been an electrician since the late 70s- instrumentation, robotics- mostly retired now. PROBLEM: at one time in the 80s everyone had their own "robotic" interpreter language. A number that stuck in my head was 600,000 languages in use; every company had their own. The main one that stuck was Fanuc Gcode, which has slowly evolved into something you can "universally" direct and command- easy to learn, mostly. When you get to the new delta bots I just get confused, because I think in Cartesian.

    Saving files for collision avoidance? I thought learning how a DirectX 3D .x file was laid out was the trick: putting the "sphere" and "shaft" of the tool tip into it and looking for when it traced the solid file parts. So, to do this, first you will need a realtime scanner to revolutionize the world. Overlay the part as you cut it, lay it over the desired STL or .x file, and nibble away with your tool tip and shaft.

    Really want to help everyone? Build a sixteen-bit-channel encoder where, instead of reading the encoder on the end of a servo's shaft that the drive interprets for you, or "trusting" the stepper to be where it is supposed to be, it reads the binary bits on a device geared to much higher resolution than the axis movement. I had one drawn up at one time on photographic film that went reel to reel with the machine. It never worked because I didn't have access to a laser to create it, and ... never did pursue it further. To modernize the concept you'd need, say, a Blu-ray disc and a mechanical spiral tracker and block reader, and it would revolutionize the positional feedback for all of us building tools and toys. A reality check for every robotic device.
     
  12. Rob Taylor

    Slight update. Time for more stream of consciousness rambling, as my brain wends its way toward a vaguely solid idea of what this project is supposed to be! Apologies for the likely repetitions as I attack the same points from multiple angles. Again, I haven't fully researched the topic yet (though I'm more informed now than in my original post), but to a certain extent I'm trying to approach this through rational thinking, rather than simply copying what's already been done. As I learn more, I may incorporate those ideas or feel like my alternate method is more appropriate, as applicable.

    After talking to an electronic engineer friend of mine, I've found two methods of making an Arduino do what I wanted the RasPi/BeagleBone to do. My original concern was the difference between true simultaneous parallel outputs vs sequential outputs across the pins, which obviously would never do for a project like this.

    That's where I got confused with "realtime". Ultimately, a CNC machine doesn't need "realtime" so much as it needs simultaneity (plus the odd bit of realtime interrupting for safety and error catching, of course, but that's a function of almost every control system available). CNC isn't an application (or at least, my projected implementation isn't) where catching, processing and passing instructions within the ten-microsecond region is required. "Realtime" in terms of "can create a single simultaneous signal from multiple lines of code", or "sends multiple instructions at a single specific time regardless of their length in the codebase", is more useful for this purpose, and is achievable using Arduino.

    All a step converter is really doing is little different to what Photoshop is doing when you hit "stroke" on a pen path. It's rasterising a set of mathematical instructions, converting it to a listed series of points based upon various parameters applied to those instructions (eg. scaling). Instead of pixels, we have motor steps, but the net result is the same.

    You could argue that realtime is required for things like curves, where the two (or more) axes are getting sent pulse trains at various times and different rates of acceleration, and therefore the frequency division has to be very fine-grained in time. But realistically, if you "rasterise" the GCode (or the original NURBS, whatever), you end up with the same step sequence as would be sent traditionally, only you don't actually need to worry about when something's being sent or arriving, only that it happens 1) simultaneously across all axes, and 2) in the correct sequence.
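    To show what I mean by "rasterising" (just a two-axis toy sketch in Python- a Bresenham-style walk, not a claim about how any existing controller plans motion):

        # Toy 2-axis "rasteriser": turn a straight move (measured in whole steps)
        # into an ordered sequence of simultaneous per-axis step commands.
        # Deciding WHEN each tick fires (acceleration) is a separate problem.
        def rasterise_line(dx, dy):
            sx = 1 if dx >= 0 else -1
            sy = 1 if dy >= 0 else -1
            dx, dy = abs(dx), abs(dy)
            err = dx - dy
            x = y = 0
            while (x, y) != (dx, dy):
                e2 = 2 * err
                step_x = step_y = 0
                if e2 > -dy:
                    err -= dy
                    x += 1
                    step_x = sx
                if e2 < dx:
                    err += dx
                    y += 1
                    step_y = sy
                yield (step_x, step_y)   # both axes move together on this tick

        print(list(rasterise_line(5, 2)))
        # [(1, 0), (1, 1), (1, 0), (1, 1), (1, 0)]

    The sequence itself is fixed ahead of time; all the controller has to do is play it back with the right timing.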

    So back to the point of achieving this with Arduino. Well, sort of. The first method uses Arduino; the second is more a function of the ATmega328's abilities than anything intrinsic to the Arduino library.

    1) Parallel multiplexing. Using a non-parallel series of output pins, synchronised via a clock source. This can be used for driving LED multi-digit displays without a driver by relying on persistence of vision, but by using switches off the pins (ie. transistors) the pin outputs can be passed and synchronised via a single clock pin directly to however many stepper drivers are in use. This may be slow if reading and pin voltage setting code isn't ultra-tight.

    2) Port manipulation. I don't know if this is how they do these Arduino Mega CNC builds, but this non-Arduino function of the ATmega328 to output genuinely parallel signals may be the best- if not necessarily the simplest. I haven't looked into how to incorporate this into an Arduino project, but it is listed on the Arduino website, so I assume it's possible. This should be faster since it's a lower-level method, allowing quicker feed rates.

    Which one is better for this project depends on the exact nature of my compromise between plug-and-play-ability and ease of troubleshooting. Using non-standard code may make it harder for inexperienced users to see what's going on, but it does mean that no extra hardware is required other than the Arduino and the stepper drivers.

    Step sequences of an appropriate length can be sent periodically via serial from the GCode interpreter computer, or loaded in text format via SD card with an additional options/setup file.
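
    For the serial route, the host side could be as simple as something like this- a rough sketch using pyserial, where the port name, baud rate and the line-by-line "ok" handshake are just assumptions for illustration, not an existing protocol:

        # Rough sketch of the host side: stream precomputed GCode/step lines over
        # serial and wait for a simple acknowledgement before sending the next.
        # Port, baud and the "ok" reply are assumptions, not a real firmware spec.
        import serial  # pyserial

        def stream_lines(lines, port="/dev/ttyUSB0", baud=115200):
            with serial.Serial(port, baud, timeout=5) as conn:
                for line in lines:
                    conn.write((line.strip() + "\n").encode("ascii"))
                    reply = conn.readline().decode("ascii").strip()
                    if reply != "ok":            # hypothetical controller reply
                        raise RuntimeError(f"controller said: {reply!r}")

        stream_lines(["G21", "G90", "G1 X10 Y0 F600"])

    The SD-card option is the same data, just written to a file instead of a port.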

    So yeah, that's where I'm at in my thinking currently. I have other projects eating 99.9% of my time, so my research sessions on this are few and far between, but I have made some degree of progress! As always, thoughts and criticisms are welcome. If anyone has direct practical knowledge of exactly how the gShield, Seeeduino, Mega 2560 CNC, GRBL, etc. ideas work as relates specifically to this project, any input would be appreciated. Otherwise, my next update like this may not be until I've researched those and seen what's what.

    Yeah, I've looked at GCode examples a little more since writing that, and it definitely seems nice and straightforward to my mathematically-inclined brain. My original issue with it was that it was based on the sequences of individual movements that analogue machinists have to do, and that seemed quite limiting for a robotic device. However, in terms of manual writing, of troubleshooting, of conversion... it seems to be the best option.

    That would be cool- no reliance on pre-existing motions, simply material is there or it isn't. I imagine this could be done with a high resolution camera and some kind of greenscreen-type keying software, that can tell the difference between material and everything else. Variable depth and curved 4-axis surfaces might be tricky though.

    As it happens, positioning was on my mind even before this project, precisely in line with your thoughts. I made a thread for that over in the Concepts and Ideas section: http://openbuilds.com/threads/rangefinder-linear-encoding.665/ - feel free to add your optical ideas to the thread if you'd like, I'll certainly keep them in consideration.

    edit: didn't finish off the parallel methods section!
     
    #12 Rob Taylor, Apr 19, 2015
    Last edited: Apr 21, 2015
  13. Rob Taylor

    Relevant to this topic:

    I've been reading through this RepRap thread on G61 vs G64 motion modes- http://forums.reprap.org/read.php?147,106954 - and it seems to sum up- or at least explicitly mention- the two concepts I'm juggling: the one-board solution, where the motion path is calculated at runtime and realtime commands are sent to position the head along the calculated path (which seems like a horrible waste of resources to me, but that may be a function of ignorance in some way), and the two-unit solution, where a more powerful processor calculates the motor steps (or possibly the coordinate sequence when used with active positioning) required for the total run and then saves the entire sequence to be performed by a low-power secondary processor. This reassures me that my thinking isn't awry and I just need to focus on the CAM sequencing logic.

    Obviously the latter is currently my preferred option; it allows the operation to run on very basic hardware since everything is running optimised for its own best abilities- the non-realtime positioning calculations can be performed at least thousands of times a second by a properly programmed multi-threaded algorithm on decent hardware, then sent to a basic microcontroller that's just moving motors around according to that data stream archive. Nowadays even mobile devices could serve as the primary processor irrespective of their available IO options.

    That seems like the "obvious" option to me as a relative outsider, but who knows- as I delve into this more, it may become apparent that the realtime method is the only one that makes sense. Right now I'm dubious, however. I understand the value of comparing the current position to the next position in the sequence, then moving if the difference is more than half a step. But even if you do that with an algorithm that doesn't so much rasterise the path graphically (as it were) as use some kind of path-following idea- where it moves a set distance along the path rather than comparing the path to a designated grid- it still makes more sense to me to run that on higher-power hardware and let the microcontrollers just do their real job. Basic measuring and movement is all I see a need for in the actual motion control equipment itself.

    That's my latest thinking, anyway, I'm enjoying this weird "let's rebuild CNC from logical first principles" thing. I'm sure in a delightful twist of irony, I'll just come up with what's already one of the most popular methods and the whole thing was fairly irrelevant, but I'm hoping the deep-delve thinking is at least somewhat interesting to follow along with.
     
