CZK - 7,5cm tankový kanon vz.39/44N (úprava kořistní zbraně pro ST-I) / 7.5 cm Tank Gun vz. 39/44 N (modification of a captured gun for the ST-I)
Autor: Jiří Tintěra

Datum: 10.08.2017 19:32:31

7,5 cm tankový kanon vz. 39/44 N 7.5 cm tank cannon Mk. 39/44 N
Originální název:
Original Name:
7,5 cm tankový kanon vz. 39/44 N
Rheinmetall-Borsig AG, Düsseldorf /
Období výroby:
Production Period:
Vyrobeno kusů:
Number of Produced:
neuveden (Not specified)
Prototyp vyroben:
Prototype Built:
ST-I (1)
Technické údaje:
Technical Data:
Hmotnost:
Weight:
1350 kg 2976 lb
Ráže:
Caliber:
75 mm 2.95 in
Náboj:
Cartridge:
75 x 495 mm
Délka hlavně:
Barrel Length:
d/48 (2) L/48 (2)
Celková délka:
Overall Length:
neuvedena (Not specified)
Používaná munice:
Ammo Used:
tříštivo-trhavý granát vz. 34
protipancéřový granát vz. 39
protipancéřový granát vz. 40
kumulativní granát vz. 38, verze B
kumulativní granát vz. 38, verze C
Mk-34 High Explosive
Mk-39 Armour Piercing
Mk-40 Armour Piercing
Mk-38 Shaped Charge Grenade, mod. B
Mk-38 Shaped Charge Grenade, mod. C
Maximální dostřel:
Maximum Range:
~7,7 km ~4.8 mi
Rychlost střelby:
Rate of Fire:
12-15 ran/min 12-15 rpm
Úsťová rychlost:
Muzzle Velocity:
740 - 790 m/s 2428 - 2592 ft/s
Kořistní zbraň upravená ve Škodových závodech Plzeň.
(1) 7,5 cm stihač tanků ST-I
(2) Dostupné prameny uvádí dva rozdílné údaje o délce hlavně: 3600 mm a 3375 mm
Captured weapon modified at the Škoda Works in Pilsen.
(1) 7.5 cm tank destroyer ST-I
(2) Available sources give two different barrel lengths: 3600 mm and 3375 mm

Výkres VTÚ: Nápisy na dělech, č.v. I-3C1-1-2/1, in VHA Praha, FL: MNO VTV, rok 1947, KL: 167, SL: 13871,
Krátký popis 75mm tankového kanonu vz.39/44N ..., in VHA Praha, FL: MNO-VTVM, rok 1952, KL: 65, SL: 24-1/242.

USA - AN/TPQ-53 (delostrelecký rádiolokátor / artillery locating radar)
Autor: buko1

Datum: 10.08.2017 12:33:36

Originální název:
Original Name:
AN/TPQ-53 Quick Reaction Capability Radar
mobilní dělostřelecký rádiolokátor mobile artillery radar
Lockheed Martin Corp., Syracuse, New York
Lockheed Martin Corp., Owego, New York
Lockheed Martin Corp., Moorestown, New Jersey
Lockheed Martin Corp., Clearwater, Florida
Období výroby:
Production Period:
Vyrobeno kusů:
Number of Produced:
~ 170 ks (pre / for )
26 ks (pre / for )
Prototyp vyroben:
Prototype Built:
Technické údaje:
Technical Data:
Anténne vozidlo:

Vozidlo s riadiacim stanovišťom:

Antenna vehicle:

Command vehicle:

? ?
Pracovná frekvencia: pásmo S (2-4 GHz)
Dosah: 60 km
Frequency: S-Band (2-4 GHz)
Range: 60 km
Další údaje:
Other Data:
Systém sa skladá z dvojice vozidiel (anténa, riadiace pracovisko) a dvoch prívesov s energetickými jednotkami.
Rádiolokátor s elektronicky vychyľovaným lúčom je schopný zamerať letiace delostrelecké rakety, projektily a mínometné granáty na vzdialenosť až 60 km a na základe ich dráhy určiť miesto odpálenia a miesto dopadu.
The radar consists of two vehicles (antenna vehicle and command post vehicle) and two trailers carrying the power units.
This mobile counter-battery radar with an electronically scanned beam can detect artillery rockets, projectiles and mortar rounds in flight at ranges of up to 60 km and, from their trajectories, determine both the launch point and the predicted impact point.
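The idea of working back from an observed trajectory can be illustrated with a deliberately simplified sketch. This is hypothetical code, not the AN/TPQ-53's actual algorithm: it assumes a flat earth, no air drag, and only two radar plots, whereas a real weapon-locating radar fits many plots to a far more detailed ballistic model.

```cpp
#include <cmath>

// One radar plot of a ballistic projectile.
struct Plot { double t, x, y; };   // time [s], downrange [m], height [m]

// Horizontal positions recovered from the trajectory.
struct Solution { double launchX, impactX; };

// Fit a drag-free parabola through two plots and solve y = 0 for the
// launch (earlier root) and impact (later root) positions.
Solution locate(Plot a, Plot b, double g = 9.81) {
    double dt = b.t - a.t;
    double vx = (b.x - a.x) / dt;              // constant horizontal speed
    double vy = (b.y - a.y) / dt + 0.5 * g * dt; // vertical speed at time a.t
    // Solve a.y + vy*tau - g/2*tau^2 = 0, tau measured from a.t.
    double disc = std::sqrt(vy * vy + 2.0 * g * a.y);
    double tauLaunch = (vy - disc) / g;        // earlier root (ground level)
    double tauImpact = (vy + disc) / g;        // later root (ground level)
    return { a.x + vx * tauLaunch, a.x + vx * tauImpact };
}
```

With two plots taken from a projectile launched at x = 0 with a 4000 m range, the sketch recovers both ground positions exactly.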

Uživatelské státy:
User States:

(od roku 2017)

(from 2017)
Vzdušná prepraviteľnosť: C-17 Globemaster III Air transport: C-17 Globemaster III

          Creating web sites with Suave: How to contribute to F# Snippets        

The core of many web sites and web APIs is very simple. Given an HTTP request, produce an HTTP response. In F#, we can represent this as a function with type Request -> Response. To make our server scalable, we should make the function asynchronous to avoid unnecessary blocking of threads. In F#, this can be captured as Request -> Async<Response>. Sounds pretty simple, right? So why are there so many evil frameworks that make simple web programming difficult?
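To make that type concrete, here is a toy, framework-free sketch. The record types and the `app` function are made up for illustration; Suave's real abstractions (`HttpContext`, `WebPart`) are richer than this:

```fsharp
// A web application as a plain function: Request -> Async<Response>.
type Request  = { Path : string }
type Response = { Status : int; Body : string }

type WebApp = Request -> Async<Response>

// Routing is then just pattern matching inside the function.
let app : WebApp = fun req -> async {
    match req.Path with
    | "/" -> return { Status = 200; Body = "Hello from F#!" }
    | _   -> return { Status = 404; Body = "Not found" } }
```

Everything a framework adds on top of this — routing combinators, content negotiation, async I/O — is composition of functions of roughly this shape.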

Fortunately, there is a nice F# library called Suave that is based exactly on the above idea:

Suave is a simple web development F# library providing a lightweight web server and a set of combinators to manipulate route flow and task composition.

I recently decided to start a new version of the F# Snippets web site and I wanted to keep the implementation functional, simple, cross-platform and easy to contribute to. I wrote a first prototype of the implementation using Suave and have already received a few contributions via pull requests! In this blog post, I'll share a few interesting aspects of the implementation and give you some good pointers to where you can learn more about Suave. There is no excuse for not contributing to F# Snippets v2 after reading this blog post!

          Comment on Docs & Demo by Digby Maass        
The maps have stopped working! I've posted on the WP support page: Suddenly the map containers are either blank, or they show the map but when enlarged the small map goes up to the corner, the rest of the screen fills with a blank, and it is unresponsive. The same on site as on a localhost installation. Error console shows a whole lot of Google js errors:
TypeError: 'undefined' is not an object (evaluating ',_.l("O"))') - common.js:216
TypeError: 'undefined' is not a function (evaluating 'a.set("length",a.j.length)') - js:38
TypeError: 'undefined' is not a function (evaluating '_.x(l3,_.J)') - stats.js:9
Google Maps API warning: SensorNotRequired - util.js:30
TypeError: 'undefined' is not a function (evaluating 'b.getSouthWest()') - common.js:9
          Luke Van Hook Paintings Now at Brand Library Galleries "Circle in the Square" Yesung Kim, Barbara Kolo, Susan Sironi, Cheryl Walker thru Sept 5th 2008        
The Entrance to the Brand Library Art Galleries in Glendale, California hosts a prominent postcard of the show "Circle in the Square" now exhibiting through September 5th, 2008


Photo above: 
Cathy Billings, Art Librarian and Gallery Manager of the 
Brand Library Art Galleries and Co-Curator of 
"Circle in the Square" selected Luke Van Hook 
as one of the artists to show his circle paintings 
which explore Giotto's fabled "perfect circle."
Photo below: 
Alyssa Resnick, Senior Library Supervisor, Gallery Director 
and Co-curator pictured with Luke Van Hook.
Both ladies made studio visits all over Los Angeles and surrounding communities in search of the "perfect circle" of artists to represent the elusive qualities of the circle.
It takes over a year to prepare for a large show at the Brand Library Art Galleries and no one will have a better story to tell you about the waiting process than Galleries Manager and Curator, Cathy Billings or Alyssa Resnick, Senior Library Supervisor and Gallery Director. These ladies traveled to Inglewood, California for a studio visit to see Luke Van Hook's circle paintings some time in the early summer of 2007. They told Luke that they were preparing to curate a show of artists working on the motif of the 'circle'.  They had already reviewed a number of artists and found making the final decision difficult, first because there were a number of artists who worked with this subject and secondly, the talent was very competitive. The subject of the circle and how each artist approaches this topic is worth dedicated study in and of itself.  These lovely ladies, Cathy and Alyssa, with a keen eye for artistic talent, selected a total of five talented artists to show together this summer.   
Here you will find photos of how each artist expressed their obsession with the circular form.  I'll begin my blog entry with a brief history of what I believe may have led Luke Van Hook to painting the circle and continue with the photos and biographical information of the additional four artists each selected for working with the motif of circles, independently of each other, with their own unique and individual interpretations of the circle: Yesung Kim, Barbara Kolo, Susan Sironi, and Cheryl Walker.
Luke Van Hook began his present study of the circle in 2005. He first discovered the legend of Giotto's "Perfect Circle" in a class about ancient history; but the idea didn't sink in at first. He needed time to reason with his quest. While Luke approached the specific task of painting the circle with thin paintbrushes and applying layer upon layer of color to a raw naked canvas, I set about trying to understand what the hell prompted my husband to go circle crazy in the first place. I started researching what the circle meant and I found a lot of literature in the realm of magic, rituals, mathematics, secret societies and romance. But my first impression was that the circle was a way to get back to the beginning of things. Then I delved deeper. Was Luke trying to say that he was going in circles? Were we at this artistic point in our lives as a result of a past life? Was our circular existence referencing our cycle of birth, death and rebirth? Or was the answer more basic than that, like "the earth is round and it's an orbital thing." There were other issues on the table I was urged to deal with also. Were these circle paintings partly influenced by the school we had attended? Once we leave school we are expected to make works of art that have fresh meaning and to blow out the cobwebs of old thinking. While at Otis College of Art and Design in Los Angeles, Luke Van Hook studied all the required areas to excel in his chosen profession as a fine arts painter, including the figure, landscapes and abstracts. But the abstract visual image is what finally drew Luke back in. Could it be the understated obvious fact that the big 'O' (which formed a circle on every memo, syllabus and brochure in the name of Otis College) was influencing him subconsciously?
Luke's earlier work involved intricately small hatch marks that evolved into large abstract images full of vibrant colors.  This work was very reminiscent of Jasper Johns.  So where did this circle idea really emanate from?  Did his hatch marks get married or what?  Observers of Luke Van Hook's work have stated that it raises the question, 'Is it a painting or a drawing?  Is it text or writing?'  Luke will often begin a row of circles that reads from left to right just as western literature is expressed.  But sometimes he changes his mind, and the direction of his technique, and he starts to paint his rows from right to left. At other times, he completes a horizontal column of circles which refers more to ancient Asian forms of writing going from the top, down.
During his graduating year at Otis College in 2004, Luke went on a mission to explore machine technology as it pertained to replacing humans. He painted large canvases with a number of faces and shapes that represented cyborgs expressing the fear, uncertainty and ambivalence that humans have toward our technological future. But once out of school, a full year later, in 2005, Luke seemed to have turned a corner. He seemed to have replaced his fear of technology with a competitive defiance that defied all reason. Luke started working with his father-in-law in his machine shop, where he started to observe how everything around him involved the circle in one way or another. He watched the machines (Fadal CNCs, numerical-control production machines) in action. The tool would spin in circles, plunging in and out of aluminum, stainless steel and plastic materials. The space left behind was almost always a perfect circle. Perhaps this was Luke's starting point. It was the first time he'd really seen a machine make simple circles, and Luke probably said something to himself like 'I can do this! Just watch me!' then promptly decided to take on his destiny. To compete with a machine may have been the early impulse that drew Luke to paint the circle, but the legend of Giotto's 'perfect circle' was what kept Luke going full steam ahead into production of abstract works of art. The initial pieces he created were prototypes. These were the experiments he and his father-in-law, Luis Ingels, worked on before moving into the handmade pieces. As his first experiment, Luke inserted a paintbrush into the collet of the machine and programmed the coordinates to match the canvas. He overshot his calculations and the brush came crashing down upon the canvas; the collet smashed the brush right through the canvas and even broke the frame. Perhaps, Luke might have thought as he and my father, Luis, looked at each other, 'it was time to go back to the drawing board'.
Undaunted by initial failure, Luke did complete an entire series of machine made circles before he went on to the main event, the competition of drawing the circles, one by one, by hand.  
Each piece of artwork created since his first attempts is meticulously reinvented into creative visual landscapes layering circles upon circles of color schemes in gradations of complementary hues. The colors reveal very subtle changes. The circles pull the eye in. The images seem to have a life of their own, a vibrant quality of pushing the viewer to look for patterns while pulling the eyes into fissures, crevices, or 'wormholes' as one collector observed. I have witnessed the intimate evolution of Luke's circles only because I have the honor and privilege of being Luke's wife. The fact that I am discussing my husband's art work is of significance only in the sense that it is somewhat rare, although not unheard of, for the artist's loved one to interject a provocative discussion of the artwork publicly in a blog; however, this is a sign of the times we live in today and I feel blessed as a writer to have this open forum to share with you the joys and struggles inherent in Luke's artistic process.
The way I see it, Luke has taken on  the impossible task of creating the perfect circle, where no perfect circle has ever existed before, despite Giotto's legend.  All mathematical equations to date reveal that there is no perfect circle. It is a myth. So why Luke has persisted in this impossible feat only reminds me of the story of Don Quixote. Here is where I see Luke chasing his windmills. This is where in my imagination, I view the circles on the canvas as Luke's quest for the impossible dream and his circles are his windmills.  His paintbrush is his sword.  Thus Luke 
Van Hook's paintings, for me, exhibit all the romantic qualities innate in a love story.  Seeking to please his beloved Lucia, these references emerging from raw canvas could be read practically like text.  Some art collectors saw the circles as Braille text or some secret code or language.  The secret, I think, lies in Luke's love of sports!  Sometimes I interpret this circle code to reflect images of the sports activities I see Luke enjoy daily;  I make visual connections to the circles on the wheels of his bicycles which hang in his studio or his skate boards that decorate the rafters of the painting bays or even the wheels that drive his car which sits resting on almost perfect circles on the driveway.
For a while, I was convinced that Luke's enthusiasm for cycling was directly influencing the subjects of his paintings because one day, I was staring at one of his earlier images, (which is hung lovingly on the wall of the dining room right over the microwave oven); I saw it hanging next to a photograph of Luke participating in the 'Death Race 1999', a bicycle ride that cycle enthusiasts pursue along the most dangerous mountainous roads known as the California Alps in Northern California at the edge of the Northern Nevada border where Markleeville meets the Carson Valley.   The image Luke had painted in 1998, while recovering, ironically, from a broken ankle suffered in a bicycle race in Minden; was the image of three bicycles in a dead heat on the gray pavement with the yellow dividing line providing a ground for what appears as three large helmets (representative of the riders) in red, green and yellow.  The eventual emergence of Luke's hatch marks from work created in 2000, can be seen on the helmets and if you are really looking for this, (with your microscope) you may even find, the very beginnings of the influences which have eventually led to this mad case of circle paintings!  The circle imagery you might be searching for could have started at the base of the bicycle's anatomy with the wheels spinning along the highway to Kingsbury Grade, somewhere near Genoa, along the bottom of the hill leading to Lake Tahoe.  I comfort myself as painter's wife, that even Picasso had his periods, as did Rembrandt, Vincent Van Gogh, Monet and Gauguin and so long as Luke Van Hook doesn't try to cut off his ear we are doing just fine with these circles.

But don't take my word for it. Luke Van Hook's circle paintings are something you should see for yourself.  The subtlety of the work is difficult to capture on film, although I tried my best to create a video after struggling with photographing the stills for three years.  But even the video work fails to reveal the whole story.  You've got to stand in front of one of these pieces to involve yourself in the novella of Luke's life.  Although I can decode a small portion of what I see through his work, the rest of the circles on the canvas are still a vague mystery to me as well.  Every relationship has its secrets.  Thus Luke and I, as artists, are no different.  Even when we know each other, there are elements of surprise and adventure that we have yet to tell each other.  The mystery in his canvases is what really thrills me to see Luke's work on display under gallery lighting! (Sales don't hurt my enthusiasm either!)

When I think of Luke Van Hook's circle paintings, today, in 2008, I often think of Luke riding a skateboard doing 'ollies' and then trying for a loop-de-loop in mid-air.  This is because in January of 2008, Luke begged for a skateboard for his birthday and little did I know what would happen when I wrapped it up for him!  He has returned to the love of his youth.  Luke Van Hook has come full circle to his beginnings to land on his home base. The skateboard has also flown in mid-air, in harmony with gravity, and both land as one in a perfect execution of a move I would never dare try to do myself.  I see each circle on the canvas as Luke's attempt to catapult his work into the mainstream of the art-world with each rotation of the paintbrush on the surface of the canvas.  This is where I see Luke Van Hook in mid catapult, surfing on the air, light in transition, from youth to inspired maturity; from student to master, with paintbrush in hand landing and continuing to roll on four wheels with a great big shit-eating grin on his face. ('four' being the lucky number of his numerology charts). I see the ordered struggle, the innate joy in the success of one loop-de-loop after another. And once in a while, I also see the crash landing and the bloody injuries.  What is more important is that Luke gets up and does it again each and every time.  Luke has to begin again with each new circle, every circle becoming a part of a larger layer of community, thus his canvases vibrate with activity, mystery, romance and adventure.  I find my own meanings in each image  as it develops day by day and I am privileged to stand beside him, admire and witness the struggle of our Don Quixote in the new millennium, first hand.
There is still time to see these paintings up close and personal. The Brand Library Art Galleries are part of the Glendale Public Library, located at 1601 West Mountain Street in the City of Glendale, 91201. Telephone: 818-548-2051 / fax 818-548-2713; visit the Brand Library Art Galleries online at    to check for Library hours.
Cookie Gallegos, Ana Porras and Martha Ingels attend the opening of "Circle in the Square" to support Luke Van Hook. Brand Art Library Galleries, Glendale, California August 2, 2008 Photo by Ginger Van Hook
(From left to right) Margo Payne, Lynn Nantana-Green and Angela Williams attend the exhibition "Circle in the Square" in support of Luke Van Hook.
Lynn Lantana-Green came to support Artist, Luke Van Hook at the opening reception of "Circle in the Square" an art exhibition held at the Brand Library Art Galleries, Glendale, California, August 2, 2008.  Photos by Ginger Van Hook
Kevin Powell came to support Luke Van Hook and enjoy the paintings at the Brand Library Art Galleries, Glendale California, August 2, 2008. Photo by Ginger Van Hook

Artist Luke Van Hook brought home-made pies to his reception of the exhibition "Circle in the Square". In addition to painting, Luke Van Hook has a reputation for making awesome pies from scratch. Photographed milling around the Double Fudge Pecan Pie and the Sweet Berry Pie were the grandchildren of Hector Sticker. Brand Library Art Galleries, Glendale, California August 2, 2008. Photo by Ginger Van Hook
(From left to right) Claudio Sticker, Hector Sticker, Peter Bolten, Martha Ingels, Luke Van Hook and Luis Ingels attend the reception of  "Circle in the Square". Luke Van Hook and Luis Ingels worked together to create circles on canvas with the use of robotic CNC machines. After creating a little over a dozen machine-made paintings, Luke went on to compete with the machine and do the circles on his own by hand, one by one. Each circle is represented as being one breath and Luke Van Hook states that these are the marks he is leaving behind which define his existence during this lifetime as he continues to pursue the legend of "Giotto's Perfect Circle". Brand Library Art Galleries, Glendale, California, August 2, 2008. Photo by Ginger Van Hook

From left to right, Ohannes Berberian, his daughter Melanie, Luke Van Hook and Rouzanna Berberian attend the opening reception of "Circle in the Square" at the Brand Library Art Galleries, August 2, 2008. Ohannes Berberian owns DigiTECH Camera Repair in Monrovia, California. Luke Van Hook and Rouzanna Berberian are both fine art painters and members of the Monrovia Association of Fine Arts (M.A.F.A.). Rouzanna Berberian is a teacher in the after-school arts programs supported by M.A.F.A., which promotes the goal of enhancing the lives of those within the community through interaction with the arts and increasing the opportunities of children through art education. Photo by Ginger Van Hook

From left to right, Kathleen Zgonc, photographer Frank Zgonc and artist Luke Van Hook attend the opening reception of 'Circle in the Square' at the Brand Library Art Galleries, August 2, 2008. Frank Zgonc is an executive member of the Monrovia Association of Fine Arts in Monrovia, California. Frank Zgonc is the vice-president and official curator of Monrovia's yearly October Art Festival. This year the October Festival will be held on Saturday and Sunday, October 11th and 12th, 2008, at the Monrovia Community Center located at 119 W. Palm Avenue in Monrovia. Free and open to the public, this art event will feature work by photographer Frank Zgonc (scheduled from 10 am to 6 pm both days). There will also be an Opening Night Celebration Saturday, October 11th from 7-9:30 pm, where the special Renaissance Award will be presented to a worthy individual who has made significant contributions to the arts.
Photo by Ginger Van Hook
Mr. and Mrs. Luke and Ginger Van Hook attend the opening reception of 'Circle in the Square' at the Brand Library Art Galleries, August 2, 2008 in Glendale, California. Luke Van Hook, an artist working from Inglewood, California, earned a BFA at Otis College of Art and Design. For several years, Van Hook has been exploring in his work Giotto's fabled "perfect circle". Over time the single-minded focus on the perfection of the circle has been subsumed by the artist's interest in the aesthetic and expressive qualities of the circle. New works depict ritualistically repeated circular brushstrokes on canvas, hemp, and other materials. Van Hook states that he began "as a challenge to myself to see if a perfect circle was possible; these circles have now morphed into a study in patience. The sense of time and the marking of time is inherent in the meticulous application of paint." The viewer can appreciate these temporal qualities but is also compelled to bring their own interpretation to the work. Are these circles pure abstraction? Combined, do they conceal deliberate shapes and forms? Or are they perhaps a secret code or language? Van Hook has exhibited at TAG Gallery, Focus One Gallery, and the Bolsky Gallery in Westchester. Luke Van Hook's paintings may also be viewed on his website:
Photo courtesy of Peter Bolten

Kevin Powell comes to support Luke Van Hook for his opening reception. Brand Library Art Galleries, Glendale, California, August 2, 2008.  Photo by Ginger Van Hook
Jason Porras attends the opening reception to support Luke Van Hook in his endeavors to pursue Giotto's legend of the 'Perfect Circle'. Brand Library Art Galleries, Glendale, California August 2, 2008. Photo By Ginger Van Hook.

Zoe Hengst, Ginger Van Hook and Martha Ingels attend the opening of "Circle in the Square" to support Luke Van Hook. Brand Library Art Galleries, Glendale, California August 2, 2008. Photo courtesy of Peter Bolten.
Zoe and Jopie Hengst walk through the center of the exhibition "Circle in the Square" to support Luke Van Hook at the opening night, August 2, 2008. Paintings by Susan Sironi in the background. Brand Library Art Galleries, Glendale, California. Photo by Ginger Van Hook.

Cookie Gallegos, Ginger Van Hook and Luke Van Hook pose for photographs in front of Luke Van Hook's painting at the Brand Library Art Galleries, August 2, 2008 Glendale, California. Photo courtesy of Peter Bolten.

Cookie Gallegos and Ana Porras watch the dance performance choreographed by Cheryl Walker, Brand Library Art Galleries, August 2, 2008, Glendale, California.
Paintings by Yesung Kim, Brand Library Art Galleries, August 2, 2008, Glendale, California. Photo by Ginger Van Hook.
Paintings by Yesung Kim, Brand Library Art Galleries, August 2, 2008, Glendale, California.
Photo by Ginger Van Hook
Yesung Kim poses for a photograph in front of her paintings at the Brand Library Art Galleries, August 02, 2008, Glendale, California. Yesung Kim from Upland, California, was born in Seoul, South Korea and holds MFA degrees from Chung-Ang University and Claremont Graduate University. Kim's mixed media pieces are seductively simple. Ordinary brown packing string is deftly applied to a painted canvas creating organic shapes that shimmer and reflect light. At times these shapes appear to be on the brink of an amoeba-like division as they spread and expand, dropping off the edge of one canvas and continuing on to another. Kim  cites the natural world and light and color as the underlying themes that both inspire and permeate her work.  Following solo shows at the Seoul Museum of Art and the Seoul Arts Center, Kim's work was most recently exhibited at the San Bernardino County Museum's Multi Media Mini Show. More information about Kim's work can be found on her website:
Photo by Ginger Van Hook
Painting by Susan Sironi, Brand Library Art Galleries, August 2, 2008 Glendale, California.
Photo by Ginger Van Hook
Glass curtain by Susan Sironi, Brand Library Art Galleries, August 2, 2008,Glendale, California. Photo by Ginger Van Hook.
Cheryl Walker designed a curtain of vinyl layers of color called 'Waterfall IV' that became the backdrop for a beautiful dance performance using the 'circle in the square' theme exhibited at the Brand Library Art Galleries in Glendale, California, August 2, 2008. Cheryl Walker holds in her hand some of the vinyl circles that were placed upon the windows at the exhibition hall. Her vinyl circles upon the windows created an illusion of stained glass. The dance piece entertained a large audience on opening night as artists, collectors, art appreciators, family and friends celebrated the mythologies, geometries, and magical and mystical qualities of the circle. Dancers Liz Curtis and Martha Carrascosa performed a piece that included participation from members of the audience.
Members of the audience, interacting with dancers Martha Carrascosa and Liz Curtis at the Brand Library Art Galleries, participated in creating a colorful cascade of window art on August 2, 2008 in Glendale, California.
Audience watches dancers Liz Curtis and Martha Carrascosa from Glendale Community College as they perform a choreographed piece by Cheryl Walker, artist. "Circle in the Square", Brand Library Art Galleries, Glendale California, August 2, 2008.  Photo By Ginger Van Hook
Dancers Liz Curtis and Martha Carrascosa performing dance choreographed by artist Cheryl Walker, (within the green curtain), Brand Library Art Galleries, Glendale, California. 
Photo by Ginger Van Hook.
Cheryl Walker engaged in performance art intersecting with window art using the artistic theme of 'Circle in the Square'. Brand Library Art Gallery, Glendale, CAlifornia August 2, 2008. Photo by Ginger Van Hook.

Cheryl Walker smiles happily on opening night, Brand Library Art Galleries, Glendale, California, August 2, 2008. Cheryl Walker, a Los Angeles artist, earned her BA in art in her home state of Minnesota, and her MFA from California State University, Long Beach. In this exhibition Walker created two large site-specific installations of vinyl, oil pastel and natural and artificial light. Walker explains that the driving force behind her work is "human interaction and improvisation in response to a natural phenomenon or situation." Trained as a painter, Walker's installations have some of the qualities of painting; when viewed head-on, the suspended layers of vinyl can appear to be two-dimensional because of their transparency, and the cut shapes and forms applied to the vinyl are reminiscent of brushstrokes--but removed from the wall these works are thrust into what she calls an "interactive field of play." The fluidity of the material she works with and her interest in collaboration between the artist and the viewer have inspired Walker to create works that can be transformed into performance pieces by dance, music and in-situ art-making. In this exhibition, a dance performance captivated the audience on opening night at the Brand Library Art Galleries, Glendale, California, August 2, 2008. Photos by Ginger Van Hook

Barbara Kolo, an artist from "Circle in the Square", poses for a photograph in front of her painting with her husband Mr. Kolo. Barbara Kolo, a Santa Monica artist, earned her BFA from the School of Visual Arts in New York City. Kolo participated in a successful two-person show at the Brand Library Art Galleries in 1999. The Brand Library Art Galleries are pleased to present (nearly ten years later) a new body of work by Barbara Kolo that connects to that which was here before. In those works and these, her focus is on representing organic materials. The current large-scale acrylic on canvas works are saturated with color; the stippled application of paint creates organic shapes and patterns representative of the natural world. The subject matter is open to each viewer's interpretation: where one may see a birch forest at dusk, others may see the bold aesthetic of pure color and abstraction. Kolo has had recent solo shows at Topanga Canyon Gallery and the Off Rose Gallery in Venice, California. More information about Kolo's work can be found on her website: Brand Library Art Galleries, Glendale California. Photo by Ginger Van Hook

Barbara Kolo poses for a photograph during opening night celebrations for the exhibition, "Circle in the Square" at the Brand Library Art Galleries, Augusts 2, 2008. Glendale, California.

Susan Sironi, an artist living in Altadena, California, posed for her photograph in front of her paintings at the Brand Library Art Galleries, Glendale, California, August 2, 2008. Susan Sironi earned her BFA at California State University, Long Beach. This exhibition showcases Sironi's recent paintings as well as her Glass Curtain installation, which is composed of conjoined antique optometric lenses. Her paintings are about texture, color and process. Small dabs of oil paint are painstakingly applied to aluminum, building up an intricate, thorny surface. Highly textured and multihued when viewed up close, this surface belies the minimalist, color-field appearance of the work at a distance. In the artist's own words, "texture and color play equal roles in these works. They ... set up contradictions within each piece. Paintings that seem to invite touch and intimacy are also reserved and autonomous. Time and process are weighed against a static and minimal structure." Sironi's work was most recently seen in the Brea Art Gallery's Made in California exhibition, at the Chouinard School of Art Gallery, and the Orange County Center for Contemporary Art. More information about Sironi's work can be found on her website:
Photo by Ginger Van Hook.  

Yesung Kim, Brand Library Art Gallery, Glendale, California, August 2, 2008.

The entrance to the Brand Library Art Galleries in Glendale, California hosts a prominent postcard of the show "Circle in the Square", now exhibiting through September 5th, 2008.

                   Luke Van Hook paintings are now showing at the Brand Library Art Galleries in
          SCSI support and a big surprise        
Last week I added SCSI disk support for the CD-i 60x extension board to CD-i Emulator. It took somewhat longer than I expected, though. This was mostly because the DP5380 SCSI controller chip exposes most low-level details of the SCSI protocol to the driver, which means that all of these details have to be emulated.

The emulation ended up being a more-or-less complete software implementation of the parallel SCSI-2 protocol, including most of the low-level signaling on the BSY, SEL, ATN, MSG, C/D, I/O, REQ and ACK lines. This is all implemented by the new CScsiBus class representing the SCSI bus, which connects up to 16 instances of the CScsiPort class, each representing a single SCSI-2 bus interface. I was able to mostly avoid per-byte signaling of REQ and ACK if the target device implementation supports block transfers, a big performance win.
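To illustrate the kind of low-level detail involved: once a target has asserted BSY and SEL has been released, the SCSI-2 information transfer phase is fully determined by the MSG, C/D and I/O lines. A minimal sketch of that decoding (my own names, not the actual CScsiBus/CScsiPort code):

```cpp
#include <cassert>

// The six SCSI-2 information transfer phases, selected by the target via
// the MSG, C/D and I/O control lines (I/O asserted = target-to-initiator).
enum class Phase { DataOut, DataIn, Command, Status, MessageOut, MessageIn };

Phase decodePhase(bool msg, bool cd, bool io) {
    if (!msg && !cd) return io ? Phase::DataIn    : Phase::DataOut;
    if (!msg &&  cd) return io ? Phase::Status    : Phase::Command;
    // MSG without C/D is reserved in SCSI-2; this sketch treats every
    // MSG case as a message phase.
    return io ? Phase::MessageIn : Phase::MessageOut;
}
```

A driver polling a DP5380-style status register does essentially this decode on every phase change, which is why the chip forces so much protocol detail into software.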

The new CCdiScsiDevice class emulates the DP5380 controller chip, working in conjunction with the CCdiScsiRamDevice and CCdiScsiDmaDevice classes that emulate the 32 KB of local extension SRAM and the discrete DMA logic around it that are included on the CD-i 60x extension board.

The CD-i 182 extension uses a compatible SCSI controller chip but a different DMA controller, and it has no local extension SRAM. I have not yet emulated these because I have almost no software to test them with.

The new CScsiDevice class implements a generic SCSI device emulating minimal versions of the four SCSI commands that are mandatory for all SCSI device types: TEST UNIT READY, REQUEST SENSE, INQUIRY and SEND DIAGNOSTIC. It implements most of the boilerplate of low-level SCSI signaling for target devices and the full command and status phases of SCSI command processing, allowing subclasses to focus on implementing the content aspects of the data transfer phase.
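As a sketch of what such a generic target dispatches on: command decoding keys off the first byte of the command descriptor block (CDB). The opcodes below are the SCSI-2 values for the four mandatory commands; the function name is mine, the real CScsiDevice interface may differ:

```cpp
#include <cstdint>
#include <string>

// Map the CDB opcode byte to one of the four commands every SCSI-2
// device type must implement; anything else is left to subclasses.
std::string mandatoryCommandName(uint8_t opcode) {
    switch (opcode) {
        case 0x00: return "TEST UNIT READY";
        case 0x03: return "REQUEST SENSE";
        case 0x12: return "INQUIRY";
        case 0x1D: return "SEND DIAGNOSTIC";
        default:   return "";   // not one of the mandatory four
    }
}
```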

The CScsiFile class emulates a SCSI device backed by a file on the host PC; it includes facilities for managing the SCSI block size and the transfer of block-sized data to and from the backing file.

The CScsiDisk and CScsiTape classes emulate a SCSI disk and tape device, respectively, currently supporting a block size of 512 bytes only. Instances of these classes are connected to the SCSI bus by using the new
-s[csi]d[isk][0-7] FILE and -s[csi]t[ape][0-7] FILE options of CD-i Emulator.

The CD-i 60x extension board normally uses SCSI id 5; the built-in ROM device descriptors for SCSI disks use SCSI ids starting at zero (/h0 /h1 /h2) while the built-in device descriptor for a SCSI tape uses SCSI id 4 (/mt0). This means that the useful options with the 60x are -scsidisk0, -scsidisk1, -scsidisk2 and -scsitape4.

I've added the new dsk subdirectory to contain disk images; tape images have no standard location as they are mostly intended for bulk-transfer purposes (see below).

Inside the CD-i player this leads to the following response to the built-in inquire command:
$ inquire -i=0
vendor identification:"CDIFAN CDIEMU SCSIDISK "

$ inquire -i=4
vendor identification:"CDIFAN CDIEMU SCSITAPE "
where the "CDIFAN " part is the vendor name and the "CDIEMU SCSIXXXX " part is the product name.
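Those strings map directly onto standard INQUIRY data, where bytes 8-15 hold the 8-character vendor id and bytes 16-31 the 16-character product id, both space padded per SCSI-2. A hedged sketch of building the 36-byte fixed part (my own helper, not CD-i Emulator's actual code):

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <string>

// Build the 36-byte fixed part of SCSI-2 INQUIRY data. Device type 0x00
// is direct access (disk), 0x01 sequential access (tape).
std::array<unsigned char, 36> makeInquiryData(unsigned char deviceType,
                                              const std::string &vendor,
                                              const std::string &product) {
    std::array<unsigned char, 36> d;
    d.fill(' ');                       // text fields are space padded
    d[0] = deviceType;
    d[1] = 0;                          // not removable
    d[2] = 2;                          // ANSI version: SCSI-2
    d[3] = 2;                          // response data format
    d[4] = 31;                         // additional length (bytes 5..35)
    d[5] = d[6] = d[7] = 0;
    std::copy_n(vendor.begin(),  std::min<std::size_t>(vendor.size(), 8),  d.begin() + 8);
    std::copy_n(product.begin(), std::min<std::size_t>(product.size(), 16), d.begin() + 16);
    return d;
}
```

The player's inquire utility apparently prints vendor and product as one run, which is consistent with the quoted 24-character strings.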

In the previous post I described a 450 MB OS-9 hard disk image that I found on the Internet. After mounting it with
-scsidisk0 mw.dsk I got the following output:
$ free /h0
"MediaWorkshop" created on: Feb 17, 1994
Capacity: 1015812 sectors (512-byte sectors, 32-sector clusters)
674144 free sectors, largest block 655552 sectors
345161728 of 520095744 bytes (329.17 of 496.00 Mb) free on media (66%)
335642624 bytes (320.09 Mb) in largest free block

$ dir -d /h0

Directory of /h0 23:49:36
ETC/ FDRAW/ FONTS/ FontExample/ ISP/
TEST/ USR/ VIDEO/ abstract.txt bibliographic.txt
bkgd.c8 bkgd.d cdb cdb1 cdb2
cdi_opt_install chris_test cin copyright.mws copyright.txt
csd_605 custominits_cin delme dos/ file
font8x8 get globs.mod go go.mkfont
inetdb ipstat kick1a_f.c8 kick2a_f.c8 mtitle
mws net new_shell new_shell.stb scratch
screen startup_cin thelist
You can see why I thought it was a MediaWorkshop disc, but on closer inspection this turned out to be something quite different. Some basic scrutiny led to the hypothesis that this is probably a disk backup of someone from Microware working on early development of the DAVID (Digital Audio Video Interactive Decoder) platform. There are various surprises on the disk which I will describe below.

Anyway, I wanted to transfer the contents to the PC as a tar archive, similar to the procedure I used for my CD-i floppy collection. After starting CD-i Emulator with a -scsitape4 mw.tar option this was simply a matter of typing the following into the terminal window:
tar cb 1 /h0
This command runs the "tape archiver" program to create a tape with the contents of the /h0 directory, using a tape blocking size of 1 (necessary because my SCSI tape emulation doesn't yet support larger block sizes). The resulting mw.tar file on the PC is only 130 MB, not 450 MB which indicates that the disk is mostly empty. At some point I might use an OS-9 "undelete" program to find out if there are additional surprises.

Extracting the mw.tar file was now a simple matter of running the PC command
tar xvf mw.tar
This produced an exact copy of the OS-9 directory structure and files on the PC.

Many of the directories on the hard disk are clearly copies of various distribution media (e.g. CDI_BASECASE, CINERGY, CURSORS, ENET, FONTS, ISP, MWOS, NFS). The contents of the ENET, ISP and NFS directories at first appear to match some of my floppies, including version numbers, but on closer inspection the binaries are different. Running some of them produces "Illegal instruction" errors so I suspect that these are 68020 binaries.

The SHIP directory contains some prerelease RTNFM software; the readme talks about PES which is a type of MPEG-2 stream (Packetized Elementary Stream). Various asset directories contain versions of a "DAVID" logo.

The CMDS directory contains working versions of the Microware C compiler, identical to the ones I already had and also many other programs. It also contains some "cdb" files (configuration database?) that mention the 68340 processor.

The contents of the CMDS/BOOTOBJS directory produced a first surprise: it contains a subdirectory JNMS containing, among others, files named "rb1793" and "scsijnms". Could these be floppy and SCSI drivers for the CD-i 182 extension, given that that extension contains a 1793 floppy drive controller (the CD-i 60x uses a different one) and the player has a "JNMS" serial number?

Well, yes and no. Disassembly of the scsijnms file proved it to be compiled C code using an interface different from OS-9 2.4 drivers, so I suspect this is an OS-9 3.x driver. In any case, I cannot use it with the stock CD-i 180 player ROMs. Bummer...

And now for the big surprise: deeply hidden in a directory structure inside the innocently named COPY directory is the complete assembly source for the VMPEG video driver module "fmvdrv". At first glance it looked very familiar from my disassembly exercises on the identically-named Gate Array 2 MPEG driver module "fmvdrv", which is as expected because I had already noticed the large similarity between these two hardware generations.

The source calls the VMPEG hardware the "IC3" implementation, which matches CD-i digital video history as I know it. The Gate Array MPEG hardware would be "IC2" and the original prototype hardware would be "IC1". Furthermore, the sources contain three source files named fmvbugs1.a to fmvbugs3.a whose source file titles are "FMV first silicon bugs routines" to "FMV third silicon bugs routines". The supplied makefile currently uses only fmvbugs3.a as is to be expected for a VMPEG driver.

The fmvbugs1.a source contains some of the picture buffer manipulation logic that I've so far carefully avoided triggering because I couldn't understand it from my disassemblies, and this is now perfectly understandable: they are workarounds for hardware bugs!

As of two hours ago, I have verified that with a little tweaking and reconstruction of a single missing constants library file these sources produce the exact "fmvdrv" driver module contained in the vmpega.rom file directly obtained from my VMPEG cartridge.

In general these sources are very heavily commented, including numerous change management comments. They also include a full set of hardware register and bit names, although no comments directly describing the hardware. This should be of great help in finally getting the digital video emulation completely working.

All of the comments are in English, although a few stray words and developer initials lead me to believe that the programmers were either Dutch or Belgian.

Disassembly comparisons led me to the conclusion that careful undoing of numerous changes should result in exact sources for the GMPEGA2 driver module "fmvdrv" as well. I might even do it at some point, although this is not high priority for me.

The disk image containing all of these surprises is publicly available on the Internet since at least 2009, which is probably someone's mistake but one for which I'm very grateful at this point!
          CD-i 180 adventures        
Over the last week I have been playing with the CD-i 180 player set. There’s lots to tell about, so this will be a series of blog posts, this being the first installment.

The CD-i 180 is the original CD-i player, manufactured jointly by Philips and Sony/Matsushita, and for years it was the development and “reference” player. The newer CD-i 605 player provided a more modern development option but it did not become the “reference” player for quite some years after its introduction.

The CD-i 180 set is quite bulky, as could be expected for first-generation hardware. I have added a picture of my set to the Hardware section of the CD-i Emulator website; more photos can be found here on the website (it’s the same player, as evidenced by the serial numbers).

The full set consists of the CDI 180 CD-i Player module, the CDI 181 Multimedia Controller or MMC module and the CDI 182 Expansion module. The modules are normally stacked on top of each other and have mechanical interlocks so they can be moved as a unit. Unfortunately, I do not have the CDI 182 Expansion module nor any user manuals; Philips brochures for the set can be found here on the ICDIA website.

Why am I interested in this dinosaur? It’s the first mass-produced CD-i player (granted, for relatively small masses), although there were presumably some earlier prototype players. As such, it contains the “original” hardware of the CD-i platform, which is interesting from both a historical and an emulation point of view.

For emulation purposes I have been trying to get hold of CD-i 180 ROMs for some years; there are several people that still have fully operational sets, but it hasn’t panned out yet. So when I saw a basic set for sale on the CD-Interactive forum I couldn’t resist the temptation. After some discussion and a little bartering with the seller I finally ordered the set about 10 days ago. Unfortunately, this set does not include a CDI 182 module or a pointing device.

I had some reservations about this being a fully working set, but I figured that at least the ROM chips would probably be okay, if nothing else that would allow me to add support for this player type to CD-i Emulator.

In old hardware the mechanical parts are usually the first to fail, and here that means the CDI 180 CD-i Player module (which is really just a CD drive with a 44.1 kHz digital output “DO” signal). A workaround for this would be using an E1 or E2 Emulator unit; these are basically CD drive simulators that on one side read a CD-i disc image from a connected SCSI hard disk and on the other side output the 44.1 kHz digital output “DO” signal. Both the CDI 180 and E1/E2 units are controlled via a 1200 baud RS232 serial input “RS” signal.

From my CD-i developer days I have two sets of both Emulator types so I started taking these out of storage. For practical reasons I decided to use an E1 unit because it has an internal SCSI hard disk and I did not have a spare one lying around. I also dug out an old Windows 98 PC, required because the Philips/OptImage emulation software doesn’t work under Windows XP and newer, and one of my 605 players (I also have two of those). Connecting everything took me a while but I had carefully stored all the required cables as well and after installing the software I had a working configuration after an hour or so. The entire configuration made quite a bit of mechanical and fan noise; I had forgotten this about older hardware!

I had selected the 605 unit with the Gate Array AH02 board because I was having emulation problems with that board, and I proceeded to do some MPEG tests on it. It turns out the hardware allows for some things that my emulator currently does not, which means that I need to do some rethinking. Anyway, on with the 180 story.

In preparation for the arrival of the 180 set I next prepared a disc image of the “OS-9 Disc” that I created in November 1993 while working as a CD-i developer. This disc contains all the OS-9 command-line programs from Professional OS-9, some OS-9 and CD-i utilities supplied by Philips and Microware and some homegrown ones as well. With this disc you can get a fully functional command-line prompt on any CD-i player with a serial port, which is very useful while researching a CD-i player’s internals.

The Philips/Optimage emulation software requires the disc image files to include the 2-second gap before logical block zero of the CD-i track, which is not usually included in the .bin or .iso files produced by CD image tools. So I modified the CD-i File program to convert my existing os9disc.bin file by prepending the 2-second gap, in the process also adding support for scrambling and unscrambling the sector data.

Scrambling is the process of XORing all data bytes in a CD-ROM or CD-i sector with a “scramble pattern” that is designed to avoid long runs of identical data bytes, which can supposedly confuse the tracking mechanism of CD drives (or so I’ve heard). It turned out that scrambling of the image data was not required, but it did allow me to verify that the CD-i File converted image of a test disc is in fact identical to the one that the Philips/Optimage mastering tools produce, except for the ECC/EDC bytes of the gap sectors, which CD-i File doesn’t know how to generate (yet). Fortunately this turned out not to be a problem; I could emulate the converted image just fine.
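For reference, the scramble pattern itself is generated by a 15-bit linear feedback shift register (polynomial x^15 + x + 1, preset to 1), as specified in ECMA-130 Annex B; because scrambling is a pure XOR, the same pattern both scrambles and unscrambles. A sketch of the generator (an assumption on my part that this matches what the conversion tool does internally):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Generate the first 'count' bytes of the ECMA-130 scramble pattern.
// The LFSR is clocked once per bit; the register's LSB is output first.
std::vector<uint8_t> scramblePattern(std::size_t count) {
    std::vector<uint8_t> pattern(count);
    uint16_t reg = 0x0001;                                // preset value
    for (std::size_t i = 0; i < count; ++i) {
        uint8_t b = 0;
        for (int bit = 0; bit < 8; ++bit) {
            b |= static_cast<uint8_t>(reg & 1) << bit;    // output bit
            uint16_t feedback = (reg ^ (reg >> 1)) & 1;   // x^15 + x + 1
            reg = static_cast<uint16_t>((reg >> 1) | (feedback << 14));
        }
        pattern[i] = b;
    }
    return pattern;
}
```

Scrambling a sector is then just XORing this pattern over bytes 12..2351; applying it a second time restores the original data.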

Last Thursday the 180 set arrived and in the evening I eagerly unpacked it. Everything appeared to be in tip-top shape, although the set had evidently seen use.

First disappointment: there is no serial port on the right side of 181 module. I remembered that this was actually an option on the module and I had not even bothered to ask the seller about it! This would make ROM extraction harder, but I was not completely without hope: the front has a Mini-DIN 8 connector marked “CONTROL” and I fully expected this to be a “standard” CD-i serial port because I seemed to remember that you could connect standard CD-i pointing devices to this port, especially a mouse. The built-in UART functions of the 68070 processor chip would have to be connected up somewhere, after all.

Second disappointment: the modules require 120V power, not the 220V we have here in Holland. I did not have a voltage converter handy so after some phone discussion with a hardware-knowledgeable friend we determined that powering up was not yet a safe option. He gave me some possible options depending on the internal configuration so I proceeded to open up the CDI 181 module, of course also motivated by curiosity.

The first thing I noticed was that there were some screws missing; obviously the module had been opened before and the person doing it had been somewhat careless. The internals also seemed somewhat familiar, especially the looks of the stickers on the ROM chips and the placement of some small yellow stickers on various other chips.

Proceeding to the primary reason for opening up the module, I next checked the power supply configuration. Alas, nothing reconfigurable for 220V; it is a fully discrete unit with the transformer actually soldered to the circuit board on both the input and output side. There are also surprisingly many connections to the actual MMC processor board, and on close inspection weird voltages like –9V and +9V are printed near the power supply outputs, apart from the expected +5V and +/–12V, so connecting a different power supply would be a major undertaking as well.

After some pondering of the internals I closed up the module again and proceeded to closely inspect the back side for serial numbers, notices, etc. They seemed somewhat familiar, but that isn’t weird, as numbers often do. Out of pure curiosity I surfed to the website with the photos to compare serial numbers, wanting to know the place of my set in the production runs.

Surprise: the serial numbers are identical! It appears that this exact set was previously owned by the owner of that website or perhaps he got the photographs from someone else. This also explained why the internals had seemed familiar: I had actually seen them before!

I verified with the seller of the set that he doesn’t know anything about the photographs; apparently my set has had at least four owners, assuming that the website owner wasn’t the original one.

On Friday I obtained a 120V converter (they were unexpectedly cheap) and that evening I proceeded to power up the 180 set. I got a nice main menu picture immediately so I proceeded to attempt to start a CD-i disc. It did not start automatically when I inserted it, which on second thought makes perfect sense because the 181 MMC module has no way to know that you’ve just inserted a disc: this information is not communicated over 180/181 interconnections. So I would need to click on the “CD-I” button to start a disc.

To click on a screen button you need a supported pointing device, so I proceeded to connect the trusty white professional CD-i mouse that belongs with my 605 players. It doesn’t work!

There are some mechanical issues which make it doubtful that the MiniDIN connector plugs connect properly, so I tried an expansion cable that fit better. Still no dice.

The next step was trying some other CD-i pointing devices, but none of them worked. No pointing devices came with the set, and the seller had advised me thus (they were presumably lost or sold separately by some previous owner). The only remaining option seemed to be the wireless remote control sensor, which supposedly uses RC5.

I tried every remote in my home, including the CD-i ones, but none of them got any reaction. After some research into the RC5 protocol this is not surprising: the 180 set probably has a distinct system address code. Not having a programmable remote handy nor a PC capable of generating infrared signals (none of my PCs have IrDA), I am again stuck!
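For background on why the system address matters: a standard RC5 frame is just 14 bits, Manchester encoded on a ~36 kHz carrier: two start bits, a toggle bit, a 5-bit system address and a 6-bit command. A consumer remote can thus send a perfectly valid frame whose every bit is right except those 5 address bits. A sketch of the frame layout (IR timing and carrier generation omitted):

```cpp
#include <cstdint>
#include <vector>

// Return the 14 logical bits of an RC5 frame, address and command MSB
// first. The toggle bit flips on each new key press so a receiver can
// distinguish a repeated press from a held key.
std::vector<int> rc5Frame(uint8_t systemAddress, uint8_t command, bool toggle) {
    std::vector<int> bits;
    bits.push_back(1);                         // start bit S1
    bits.push_back(1);                         // start bit S2 (field bit in extended RC5)
    bits.push_back(toggle ? 1 : 0);
    for (int i = 4; i >= 0; --i)               // 5 system address bits
        bits.push_back((systemAddress >> i) & 1);
    for (int i = 5; i >= 0; --i)               // 6 command bits
        bits.push_back((command >> i) & 1);
    return bits;                               // 14 bits total
}
```

Each logical bit is then sent as two 889 µs half-bit periods (bi-phase), which is what a PC interface or PIC kit would have to generate.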

I spent some time surfing the Internet looking for RC5 remotes and PC interfaces that can generate RC5 signals. Programmable remotes requiring a learning stage are obviously not an option, so it would have to be a fully PC-programmable remote; those are somewhat expensive and I’m not convinced they would work. The PC interface seems the best option for now; I found some do-it-yourself circuits and kits, but it is all quite involved. I’ve also given some thought to PIC kits, which could in principle also support a standard CD-i or PC mouse or even a joystick, but I haven’t pursued these options much further yet.

Next I went looking for ways to at least get the contents of the ROM chips, as I had determined that these were socketed inside the MMC module and could easily be removed. There are four 27C100 chips inside the module, each of which contains 128 KB of data, for a total of 512 KB, which is the same as for the CD-i 605 player (ignoring expansion and full-motion video ROMs). The regular way to do this involves using a ROM reading device, but I don’t have one handy that supports this chip type, and neither does the hardware friend I mentioned earlier.

I do have access to an old 8-bit Z80 hobbyist-built system capable of reading and writing up to 27512 chips, which are 64 KB; it should be possible to extend this to at least read the 27C100 chip type. This would require adapting the socket (the 27512 has 28 pins whereas the 27C100 has 32) and adding one extra address bit, if nothing else with just a spare wire. But the Z80 system is not at my house and some hardware modifications to it would be required, for which I would have to inspect the system first and dig up the circuit diagrams; all quite disappointing.

While researching the chip pinouts I suddenly had an idea: what if I used the CD-i 605 Expansion board, which also has ROM sockets? This seemed an option, but with two kids running around I did not want to open up the set. That evening, however, I took the board out of the 605 (this is easily done as both player and board were designed for it) and found that this Expansion board contains two 27C020 chips, each containing 256 KB of data. These also have 32 pins but the pinouts are a little different, so a socket adapter would also be needed. I checked the 605 technical manual and it did not mention anything about configurable ROM chip types (it did mention configurable RAM chip types, though), so an adapter seemed the way to go. I collected some spare 40-pin sockets from storage (boy, have I got a lot of that) and proceeded to open up the 180 set and take out the ROM chips.

When determining the mechanical fit of the two sockets for the adapter I noticed three jumpers adjacent to the ROM sockets of the expansion board, and I wondered… Tracing the board connections indicated that these jumpers were indeed connected to exactly the ROM socket pins that differ between 27C100 and 27C020, and other connections made it at least plausible that these jumpers were made for exactly this purpose.

So I changed the jumpers and inserted one 180 ROM. This would avoid OS-9 inadvertently using data from the ROM because only half of each 16-bit word would be present, thus ensuring that no module headers would be detected, and in the event of disaster I would lose only a single ROM chip (not that I expected that to be very likely, but you never know).

Powering up the player worked exactly as expected, no suspicious smoke or heat generation, so the next step was software. It turns out that CD-i Link already supports downloading of ROM data from specific memory addresses and I had already determined those addresses from the 605 technical manual. So I connected the CD-i 605 null-modem cable with my USB-to-Serial adapter between CD-i player and my laptop and fired off the command line:

cdilink -p 3 -a 50000 -s 256K -u u21.rom

(U21 being the socket number of the specific ROM I chose first).

After a minute I aborted the upload and checked the result, and lo and behold the u21.rom file looked like an even-byte-only ROM dump:
00000000  4a00 000b 0000 0000 0004 8000 0000 0000  J...............
00000010  0000 0000 0000 003a 0000 705f 6d6c 2e6f  .......:..p_ml.o
00000020  7406 0c20 0000 0000 0101 0101 0101 0101  t.. ............
This was hopeful, so I restarted the upload and waited some six minutes for it to complete. Just to be sure I redid the upload from address 58000 and got an identical file, thus ruling out any flaky bits or timing problems (I had already checked that the access times of the 27C100 and 27C020 chips were identical, at 150 ns).

In an attempt to speed up the procedure, I next tried two ROMs at once, using ones that I thought were not a matched even/odd set. The 605 would not boot! It later turned out that the socket numbering did not correspond to the even/odd pairing as I expected, so this was probably caused by the two ROMs being exactly a matched set and OS-9 getting confused as a result. But using a single ROM it worked fine.

I proceeded to repeat the following procedure for the next three ROMs: turn off the 605, remove the expansion board, unsocket the previous ROM chip, socket the next ROM chip, reinsert the expansion board, turn on the 605 and run CD-i Link twice. It took a while, all in all just under an hour.

While these uploads were running I wrote two small programs, rsplit and rjoin, to manipulate the ROM files into a correct 512 KB 180 ROM image. Around 00:30 I had a final cdi180b.rom file that looked good, and I ran it through cditype -mod to verify that it indeed looked like a CD-i player ROM:
  Addr     Size      Owner    Perm Type Revs  Ed #  Crc   Module name
-------- -------- ----------- ---- ---- ---- ----- ------ ------------
0000509a      192         0.0 0003 Data 8001     1 fba055 copyright
0000515a    26650         0.0 0555 Sys  a000    83 090798 kernel
0000b974      344         0.0 0555 Sys  8002    22 b20da9 init
0000bacc     2848         0.0 0555 Fman a00b    35 28611f ucm
0000c5ec     5592         0.0 0555 Fman a000    17 63023d nrf
0000dbc4     2270         0.0 0555 Fman a000    35 d6a976 pipeman
0000e4a2      774         0.0 0555 Driv a001     6 81a3e9 nvdrv
0000e7a8      356         0.0 0555 Sys  a01e    15 e69105 rp5c15
0000e90c      136         0.0 0555 Desc 8000     1 f25f23 tim070
0000e994      420         0.0 0555 Driv a00c     6 7b3913 tim070driv
0000eb38      172         0.0 0555 Driv a000     1 407f81 null
0000ebe4      102         0.0 0555 Desc 8000     2 cf450e pipe
0000ec4a       94         0.0 0555 Desc 8000     1 f54010 nvr
0000eca8       96         0.0 0555 Desc 8000     1 17ec68 icard
0000ed08     1934         0.0 0555 Fman a000    31 b41f17 scf
0000f496      120         0.0 0555 Desc 8000    61 dd8776 t2
0000f50e     1578         0.0 0555 Driv a020    16 d0a854 u68070
0000fb38      176         0.1 0777 5    8001     1 a519f6 csd_mmc
0000fbe8     5026         0.0 0555 Sys  a000   292 e33cc5 csdinit
00010f8a      136         0.0 0555 Desc 8000     6 041e2b iic
00011012      152         0.0 0555 Driv a02c    22 e29688 ceniic
000110aa      166         0.0 0555 Desc 8000     8 c5b823 ptr
00011150      196         0.0 0555 Desc 8000     8 a0e276 cdikeys
00011214      168         0.0 0555 Desc 8000     8 439a33 ptr2
000112bc     3134         0.0 0555 Driv a016    11 faf88d periic
00011efa     4510         0.0 0555 Fman a003    96 a4d145 cdfm
00013098    15222         0.0 0555 Driv a038    28 122c79 cdap18x
00016c0e      134         0.0 0555 Desc 8000     2 35f12f cd
00016c94      134         0.0 0555 Desc 8000     2 d2ce2f ap
00016d1a      130         0.0 0555 Desc 8000     1 1586c2 vid
00016d9c    18082       10.48 0555 Trap c00a     6 5f673d cio
0001b43e     7798         1.0 0555 Trap c001    13 46c5dc math
0001d2b4     2992         0.0 0555 Data 8020     1 191a59 FONT8X8
0001de64      134         0.0 0555 Desc 8000     2 c5ed0e dd
0001deea    66564         0.0 0555 Driv a012    48 660a91 video
0002e2ee    62622         0.1 0555 Prog 8008    20 ec5459 ps
0003d78c     4272         0.0 0003 Data 8001     1 9f3982 ps_medium.font
0003e83c      800         0.0 0003 Data 8002     1 c1ac25 ps_icons.clut
00040000     2976         0.0 0003 Data 8002     1 0a3b97 ps_small.font
00040ba0     7456         0.0 0003 Data 8002     1 764338 ps_icons.clu8
000428c0   107600         0.0 0003 Data 8002     1 7b9b4e ps_panel.dyuv
0005cd10    35360         0.0 0003 Data 8001     1 2a8fcd ps_girl.dyuv
00065730    35360         0.0 0003 Data 8002     1 e1bb6a ps_mesa.dyuv
0006e150    35360         0.0 0003 Data 8002     1 8e394b ps_map.dyuv
00076b70    35360         0.0 0003 Data 8002     1 c60e5e ps_kids.dyuv

File         Size   Type         Description
------------ ------ ------------ ------------
cdi180b.rom  512K   cdi000x.rom  Unknown CD-i system ROM
cdi180b.rom  512K   cdi000x.mdl  Unknown CD-i player
cdi180b.rom  512K   unknown.brd  Unknown board
Of course cditype didn’t correctly detect the ROM, player and board type, but the list of modules looks exactly like a CD-i player system ROM. It is in fact very similar to the CD-i 605 system ROM, the major differences are the presence of the icard and *iic drivers, the absence of a slave module and the different player shell (ps module with separate ps_* data modules instead of a single play module).
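The rsplit/rjoin step mentioned above amounts to byte de-interleaving and interleaving: the 68070 fetches 16-bit words, so one 27C100 holds the even (high) byte lane of each word and its partner the odd (low) byte lane. A sketch of the join step (my reconstruction; the actual tools may differ in naming and error handling):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Interleave the even-lane and odd-lane ROM dumps back into one image:
// even[i] becomes the high byte and odd[i] the low byte of word i.
std::vector<uint8_t> joinRoms(const std::vector<uint8_t> &even,
                              const std::vector<uint8_t> &odd) {
    std::vector<uint8_t> image(2 * even.size());
    for (std::size_t i = 0; i < even.size() && i < odd.size(); ++i) {
        image[2 * i]     = even[i];
        image[2 * i + 1] = odd[i];
    }
    return image;
}
```

The split step for a ROM programmer is simply the inverse: copy every second byte of the image into each output file.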

It being quite late already, I resocketed all the ROMs in the proper places and closed up both players, after testing that they were both fully functional (insofar as I could test the 180 set), fully intending to clean up and go to bed. As an afterthought, I took a picture of the running 180 set and posted it on the CD-Interactive forums as the definitive answer to the 50/60 Hz power question I’d asked there earlier.

The CD-i Emulator urge started itching, however, so I decided to give emulation of my new ROM file a quick go, fully intending to stop at any major problems. I didn’t encounter any of those, however, until I had a running CD-i 180 player three hours later. I reported the fact on the CD-Interactive forum, noting that there was no pointing device or disc access yet, and went to a well-deserved sleep. Both of these issues are major ones and I postponed them to the next day.

To get the new player type up and running inside CD-i Emulator, I started by using the CD-i 605 F1 system specification files cdi605a.mdl and minimmc.brd as templates to create the new CD-i 180 F2 system files cdi180b.mdl and maximmc.brd. Next I fired up the emulator and was rewarded with bus errors. Not unexpected, and a good indicator of where the problems are. Using the debugger and disassembler I quickly determined that the problems were, as expected, the presence of the VSR instead of the VSD and the replacement of the SLAVE by something else. Straightening these out took a bit of time, but it was not hard work and very similar to work I had done before on other player types.

This time at least the processor and most of the hardware were known and already emulated; for the Portable CD-i board (used by the CD-i 370, DVE200 and GDI700 players) both of these were not the case, as they use the 68341 so-called integrated CD-i engine, which in my opinion is sorely misnamed as there is nothing CD-i about the chip: it is just the Motorola version of a 68K processor with many on-chip peripherals, remarkably similar to the Philips 68070 in basic functionality.

Saturday was spent doing household chores with ROM research in between, looking for a way to get the pointing device working. It turned out to be quite involved, but at the end of the day I had it sort of flakily working in a kludgy way; I’ll report the details in the next blog post.

Sunday I spent some time fixing the flakiness and thinking a lot about fixing the kludginess; this remains to be done. I also spent time making screenshots and writing this blog post.

So to finish up, there is now a series of 180 screenshots here on the CD-i Emulator website as reported in the What's New section. A very nice player shell, actually, especially for a first generation machine.

I will report some ROM and chip finds, including new hope for replacing the missing pointing device, in the next blog post.
          Another Blog Hop         
Many of you may have noticed that there seems to be a blog hop circulating around at the moment. The lovely Tea from Tea Okereke chose me to continue the 'Hop!' So here I go.
Photo: Michael Dooney Post: My Cut Out Lace Dress Challenge

Why do you write?
Writing is not exactly a strength of mine; I am generally envious of many bloggers' writing styles. I think that my structured, science-y brain makes me a little dry. But there are a number of reasons for writing my blog.
The main reason is to document my sewing process and hopefully encourage others to take up the hobby. I am a firm believer in sustainable fashion and believe that understanding how a garment is made (from sourcing fabrics to constructing and finishing details) gives us some insight into clothing production.
Writing the blog also pushes me to set self-imposed deadlines for my sewing. I have realised that I need these deadlines, otherwise I get easily distracted! Having a blog which focuses on sewing and creativity is a wonderful thing for my overly excited brain!

What are you working on?
At the moment I am working on opening and revamping my Fickle Sense Etsy store (I have 2 shops: one for fashion (Fickle Sense) and one for screen printing (FS Screen Printing)). In the Fickle Sense store I am combining my loves for character design, illustration, textile design, screen printing, sewing and sustainable fashion. Having my own fashion label has been a dream of mine.... so hopefully all of my hard work pays off. I am making handmade, organic pyjamas! I have not been posting too much on the Fickle Sense blog as I have been drawing, screen printing, sewing prototypes and sewing my stock for the past few months. This image is a sneak peek into what you can expect. I will hopefully have the shop launched in about 2 weeks! The theme is English Breakfast!

How does your blog differ from others of its genre?
I don't know how much I differ from other sewing type blogs, but I can tell you how I like to work... perhaps this makes me a little different?

My husband and I have a combined love for photography, so our photo-taking process is very planned out. For 95% of our images we use medium format cameras (either the Yashica or the Mamiya) and take 5 - 10 frames each shoot. Sometimes we have an idea of what we want the image to look like (e.g. for the snowy photos in the picture above, 'My Cut Out Lace Challenge', I knew that it was going to snow the next morning, so we got up early before work and went to take photos in the fresh snow, while the snow clung to the trees). Or sometimes we take a day trip somewhere and take photos there. We get the film developed by a one-man local lab. My husband then scans the film for me. So it is quite a long process compared to digital.
I am also a lover of textile design, so I often create my own prints and textiles. I am hoping to be more experimental and artistic with my outfits in the coming months. So there are more textiles to come!
I also have a true passion for sustainable fashion. So much so, I have created a website named 'i give 2 hoots' which focuses on sustainable fashion. I am revamping the site ready for more inspiring bloggers. Find out more here.


How does your writing process work?
I have tried to set dedicated times to blog. I was inspired to try this out as many artists, such as Nick Cave, block out time for writing. This was not so successful for me. I found that I like to jump between projects (e.g. knitting, crocheting, writing, sewing, drawing, printing) depending on my mood. I generally need to be on a 'writing roll' and will then write a heap of posts at one time.

Thanks to Tea for mentioning me in her Hop. Tea really creates some lovely, colourful garments and I enjoy the stories that go alongside her garments on Tea Okereke. I particularly liked this neon pink number below. I was first drawn to it because of the parrot print (I am a bird lover), but then, reading deeper into her story, it turned out to be a useful outfit for an archaeologist :) You can read her reasoning here.
Image Tea Okereke
Now the next two Hops are going to..... Meg from Made by Meg and Heather Lou from The Closet Case Files.
Made by Meg must be the hardest-working sewer out there. There are always newly sewn garments featured on her blog with reviews. Certainly one to follow. I also like that she sews for her man. I am a fan of menswear tailoring (I would love to do a tailoring course) and sewing for my husband, so I love seeing others sewing for the special man in their lives. My favourite outfit of hers is the summer bustier.

Image from Made by Meg

Heather Lou is also an inspiring blogger. My favourite post of hers was a rather personal one, Taking a Leap. This post discusses her career change: she now makes her own indie patterns for sewers to create; the Bombshell Swimsuit, Nettie Dress and Bodysuit, and Ginger Skinny Jeans. What a brave soul. I also love that she is a true sewing community member who often writes about others' projects. This image below is my favourite outfit of hers: Sallie Silk in Shigawake.

Image from Closet Case Files
Blog on!

          S12 Club New design – lightweight and efficient plywood racing dinghy        
A new lightweight 12 ft plywood dinghy is under development; this prototype is by Canadian Rick Landreville. It won't have a cloud of sail area, but will be light enough and low-drag enough to hit consistently high speeds relative to other boats of this length.
          Solid Concepts Inc. Provides 3D Printing for Luxury Sports Car        

Solid Concepts Inc. provided 3D printed and cast urethane prototype and pre-production components for Equus Automotive’s BASS770, a luxury American muscle car.

(PRWeb April 22, 2014)

Read the full story at

          Solid Concepts Opens Up Selective Laser Sintering (SLS) Capacity        

Solid Concepts has optimized machine space on their SLS platforms so as to offer customers low volume manufacturing of prototypes.

(PRWeb September 10, 2013)

Read the full story at

          PolyJet Over-Mold Reinvents 3D Printed Prototypes        

Solid Concepts’ latest process introduction, PolyJet Over-Mold, offers customers even greater design advantages. The new PolyJet technology allows multiple materials to be utilized in a single build, allowing for over-molded parts as well as parts built from a variety of durometers that more realistically simulate production parts.

(PRWeb March 12, 2013)

Read the full story at

          Solid Concepts is the 2nd Largest FDM Service Provider in North America        

Solid Concepts, a custom manufacturing company, continues to grow capacity to meet customer demand for additive manufactured products. Fused Deposition Modeling, or FDM, is excellent for prototypes and end-use products because it builds with production-grade thermoplastic material.

(PRWeb February 28, 2013)

Read the full story at

          Daytona 24 Hours (VTR iRacing stream)        
It's been five and a half years since Vader™ Trophy Racing closed the doors of its successful GTP endurance racing programme, but for one race only WE'RE BACK*!

There have been thoughts of a reunion race of some sort for a couple of months now, and as iRacing's Road Warrior Series kicks off with the Daytona 24 Hours later today, we thought it the ideal opportunity to make it happen.

The race consists of:
    * HPD ARX-01c prototype (our chosen, particularly fragile torpedo)
    * Ruf RT 12R (a tweaked Porsche 911)
    * BMW Z4 GT3
    * McLaren MP4-12c GT3
    * Ford GT

We're all British for this race, as Neil Pearson's pedals are broken, Esben Tipple's too busy until the summer and Jannis Koopmann ran off years ago. No Ian Woollam or Dave Ellis** either, sadly. A four man, one car team consisting of Neil Stratton, Richard Dickson, Graham Bridgett and myself.

With Rich starting, I'm doing the second stint at around 2:40 to 3:30pm, and without our resident Aussie (Neil Pearson), I picked the loose straw and also get to do the graveyard stint at 1:30 to 3:10am, and then 5:40 to 6:30am. If all goes to plan, Rich will be bringing the car across the line at 1pm on Sunday. All times GMT.

The race itself kicks off today at 1pm GMT, and I've set up a Twitch account to stream the bits where I'm awake (even when I'm not driving). For those interested in tuning in:

Team: Vader Trophy Racing
Time: 1pm start until around 5pm, then 1:30am until chequered flag (GMT)
Notes: Teamspeak on, language warning, British humour possible

iRacing's 24 Hour promo video:

* There may be other races if we felt this one was more fun than pain. ;)

** Some may have tuned in to our radio broadcast the last time we raced at Daytona, around 8 years ago. Dave and I shared a car, and together with our other car, spent the entire race taking the proverbial out of the newly formed and rubbish Russian 36 Team, who, it later turned out, had been listening to our radio all along...
          The Decay of Digital Things        
With Andrew Lovett-Barron and Maryanna Rogers, I taught a pop-up studio workshop called "The Decay of Digital Things" at Stanford's this May. Using examples from iPhone operating systems to aged and eccentric artificial intelligences, we made speculative prototypes exploring future deaths and afterlives for computational objects.
          6 Companies to Build Deep Space Habitat Prototypes Under NASA Program        
Six companies aim to build ground prototypes of deep space habitat concepts through NASA‘s Next Space Technologies for Exploration Partnerships-2 program. NASA said Friday prototypes from Bigelow Aerospace, Boeing, Lockheed Martin, Orbital ATK, Sierra Nevada Corp. and NanoRacks will help the agency assess configurations of proposed habitats and support tests that will validate standards and common interfaces currently under consideration. All […]
          Tesla Model 3... It's here and it looks mostly finished! All right!        

Tesla unveiled its much-ballyhooed Model 3 to an invite-only crowd and via live stream this past Thursday.  Looks like it will be a winner because as much as I want to hate it, I really want to drive one.  And that’s right, I said ballyhooed! 

The unveiling started out like any other with Tesla’s CEO, uber-techy Elon Musk, being introduced by the car’s design boss Franz von Holzhausen (who has the dual distinction of not only being the lead designer on what could be one of the most groundbreaking vehicles in history AND the owner of perhaps the most German-sounding name EVER).  Musk stalled the big reveal by discussing how we’re killing the planet and poisoning ourselves with exhaust fumes.  He also reviewed his master plan for shifting the paradigm of sustainable personal transportation – and how their new big factory will be making a lot of batteries.  Seriously, like, all the batteries. 

But let’s take a moment to look at Musk’s presentation style.  It’s relatively unpolished and full of little self-deprecating jabs at the company’s history of production issues.  This must make him pretty approachable because the audience felt free to yell stuff at him fairly frequently.  It all helps cement Musk’s public persona as a young, plucky, success-by-the-skin-of-his-teeth type that contrasts against the public’s idea of what automotive executives are like at those other car companies.  However, he is 44 now and it does seem somewhat man-childish.  But he also has a net worth of over $13 Billion, so what the hell do I know?

He also went over the minimum specs:

0-60 in under 6 seconds.

Seating for 5 adults.

A trunk and a frunk offering more storage capacity than any other four door sedan.

215 mile range per charge.


Supercharging capabilities (for the battery, not for the motor).

5-Star safety rating in all categories. 

Sounds good to me.  Also, keep in mind that the batteries in these things sit on the bottom of the car, essentially making up its floor, so its centre of gravity is lower than a 700 lb dwarf's and more evenly distributed than the soul-deadening sorrow experienced by a Burger King night shift crew.  So yes, it will likely out-handle your C-Class, 3 Series or A4.  Soundly.

Now let’s finally get to the car!  The reveal happened just like any other new car debut, thumping beats, flashing lights, a vague but captivating intro video and three examples rolling onstage to thunderous applause.  Overall, I would say it has the looks to back up its impressive specs. 

It has a nice compact exterior somewhat reminiscent of the Mazda 3.  It sports some serious multi-spoke wheels shod with low profile tires that wouldn’t look out of place on a BMW… or the aforementioned Mazda.  In fact, may I be so bold as to say the new Model 3 could pass as a slightly less swoopy sibling to the Mazda 3? 

Is it also a coincidence that both cars have essentially the same name?? 


Mazda 3 = Model 3

Now take away the words starting with “M”. 

3 = 3


And now a side by side, paying particular attention to where the C-Pillar meets the quarter panel:


Startling indeed.

Also, while reading about this car it was pointed out by at least one observant automotive commentator that the trunk probably should have been designed as a hatch instead of a notch.  Why boast about segment leading storage capacity and then limit access to it with a tiny opening?!!?  Why!?!?  It would be like boasting how the plumbing system you designed can flush a whole watermelon but then throwing a standard sized toilet on it.  Sure it can accommodate the problems associated with a binge night of double cheese pizza and all you can eat Indian buffet, but only in theory.  

Another thing.  What the hell happened to this thing's face?  Tesla, you forgot the grill.  I know it’s an electric car and it doesn’t NEED one.  But the car is designed like it’s supposed to have one and they just didn’t put it on.  Like they got to that point in the design, took a break, and when they came back they all got sidetracked and moved on to something else.  It’s the most off-putting thing I’ve ever seen on an otherwise acceptably attractive vehicle.

That’s right, off-putting.  For example:


Or even more disturbingly:


See what I mean?  Ken doll syndrome.   

Ok, so I know it won’t have any effect on people buying it.  Hell, they already have almost a quarter million orders for the thing.  But why was it saddled with this loose end??  The prototype designs that hit the internet all had grills.  The Model S has a grill, right?  So what the frack?! 

To be honest I’m only acting harsh to this thing because it’s so awesome.  If I had $35K to spend on a car, I would likely get this one.  But I don’t, so I’m going to complain about it so I feel better.

But damn if that beak isn’t freaky.  So like a 24 hour marinated T-Bone, I would say this car needs to be grilled up STAT.    


That’s better. 




          Back in Business        
It is kind of astonishing to realize that it has been over a year since I updated this blog.  Life has been, uh, busy. Primarily, what I’ve been doing is opening a studio, building a team, and developing next-gen prototypes.  Legally, I can’t talk about any of the details, but I am very grateful to […]
          The Sony Future Lab could be as forward-thinking as the Walkman        

 Sony is doing something unheard of in the consumer electronics space: it’s going to show off prototypes. Can you imagine Apple or Samsung doing that? Not really, no. For several ...

The post The Sony Future Lab could be as forward-thinking as the Walkman appeared first on Derinmavi.

          Summer internships at the WTDTF and Duke Energy eGRID available for undergraduates        
CURI is interviewing for undergraduate internship positions to give students the opportunity to work on the testing of next-generation wind turbine prototypes and a unique industrial-scale electric grid test bed this summer ...
          Lockheed Martin provides details into use of Shuttle-era cargo pods for proposed cislunar habitat        

Lockheed Martin was recently selected by NASA to build a full-scale prototype of a cislunar habitat. The development of the habitation module is part of the Phase II contract for the Next Space Technologies for Exploration Partnerships (NextSTEP) program.

The post Lockheed Martin provides details into use of Shuttle-era cargo pods for proposed cislunar habitat appeared first on SpaceFlight Insider.

          Episode 41 - Prototrek        
Well, we're back (and we made pretty decent time too!) and we're here to complete our review of Prototype. (Sadly we forgot to put in the actual scores after an hour of ranting - find it on.) Tell us what you think about the opening; it's been a while since we've done one. As for the comic and the "Trek" part: I went through the whole thing geeked out, wearing the uniform and everything... for no particular reason at all... The comic is one I made when the Star Trek movie came out; I decided to take advantage. I've got some Prototype comics in mind, so keep an eye on the ning we've got; I'll be posting them there.
          Hello, May We Introduce Ourselves?        

Thierry Baudet and Sander Ruijter

The vast majority of the Dutch parliamentary representation is unknown and therefore unloved. With the launch of the website '' on Wednesday, 14 March, the gap between politics and citizens is being bridged digitally. With a single keyword you know exactly whom you can best approach directly. Sander Ruijter (26) and Thierry Baudet (24) worked for two years on a personal introduction to the members of the Tweede Kamer, and met 'interesting bons vivants' who "work extremely hard and take important decisions on our behalf every day".

Text & photography: Iwan Brave

Ruijter and Baudet are prototypical young go-getters. Although from well-off families, they take nothing for granted. They are very busy, but not frazzled. Above all, they are self-assured. "A good idea sells itself," says Ruijter. Still, they worked extremely hard for two years on their brainchild, quizzing all 150 members of parliament about everything imaginable.

They had to pull out all the stops on fundraising. In the end they won over two funds: Nederland Kennisland, which stimulates new internet projects, and the Forum voor Democratische Ontwikkeling (FDO). "Funds in the Netherlands are generally very conservative," says Ruijter. "They are often legacies of rich barons who said: 'My money goes to literature.' The internet did not exist back then, so you fail to meet their objectives from the outset." Baudet: "And our project is highly innovative."

And rightly so. The MPs were asked not only about their personal interests, but also on whose behalf or for which group they sit in parliament, about their voting behaviour after important debates, and of course which portfolio(s) they sink their teeth into every day. On this basis, Ruijter and Baudet created and entered the profiles.

When a visitor (citizen) enters one or more keywords ('criteria') on the website, the program makes a selection of the most relevant profiles. Questions can be put to the MPs, but remarks, suggestions and complaints can also be passed on.

Neighbourhood safety, liveability, entrepreneurship, childcare, crime, and so on. Besides the obvious issues, you can also see whom you can best approach about matters that play less in the foreground but nonetheless shape wellbeing and prosperity every day, for example play space for children or making public spaces accessible to the disabled. It is about direct communication, as the email addresses of all MPs are linked to the website. Both question and answer are published on the website.

It is almost impossible not to be carried along by the boys' boundless enthusiasm for their project. It comes with the embarrassing feeling that nobody hit on this much earlier. The proverbial egg of Columbus. It was actually laid in February 2005, when the website of the Tweede Kamer was awarded the 'Webflop 2005', an initiative of Burger@overheid and Tros Radar. 'The Tweede Kamer makes insufficient use of the possibilities of the internet,' the jury ruled damningly. 'Parliament should set an example in digitally bridging the gap between citizens and politics, and that is currently not sufficiently the case.'

And that gap is what Ruijter and Baudet look set to close. More than that: they even want to do away with the cliché. "I so hate that expression 'the gap'," says Ruijter. "It implies a distance between me and that MP. Then I simply introduce myself."

And that is essentially what they did: two citizens walking up to parliament and more or less saying: 'Hello, we are Sander and Thierry, whom you represent; who are you?' They needed no mountaineering gear at all. Ruijter: "How can you talk about a 'gap' in a nice, flat little country like the Netherlands? Just introduce yourself and get acquainted."

It was an introduction full of surprises. Baudet: "We said to each other so often: 'This is going to be a nice site.' People with particular ideals, about whom they represent, or which part of society. I got to know quite a few interesting people who take important decisions on our behalf every day. And those are people you want to get to know."

Ruijter: "At first you also think in prejudices, because the party leader determines the party's image. We sometimes did as many as fifteen interviews in a day, and then you came across all those colours and life stories."

For many an MP, too, their visit came as a relief. "Finally I get the chance to speak for myself," responded Madelaine van Toorenburg of the CDA, as if she had felt like a muzzled parliamentarian all that time.

Not that the boys want to urge the whole of the Netherlands to go and make a personal acquaintance en masse. "No, then KPN would collapse. We have just made it a little easier for everyone."

The idea also has journalistic roots. Ruijter and Baudet were presenters of the programme 'Publieke zaak' on Business News Radio (BNR), which is about citizens' initiative: not complaining, but taking matters into your own hands for a liveable environment. Along the way they discovered that quite a few MPs sat unknown on the back benches. That is how they came up with the idea of starting the 'backbencher profile' item. Baudet: "The ultimate citizens' initiative is giving yourself to the country for four years."

Baudet studied law and history and is now a doctoral candidate working on a dissertation about the 'founding principles of our legal system'. Ruijter studied economics and Japanese and is shortly leaving for London, where he will work for a bank. With their multidisciplinary outlook they set to work on their project. "Don't stay in a silo. Have the restlessness and the ambition to look beyond your own boundaries," is how Ruijter puts it.

They did not settle for clichés such as 'my social engagement was instilled in me with mother's milk' (about ten MPs said that). "Everyone gets that with mother's milk," says Ruijter. "These are 150 people who work extremely hard for society. You can come from the VVD nest and still have submitted a good education plan. Sharon Gesthuizen of the SP declared: 'I sit here in parliament for the small entrepreneur. I have experienced it myself and know how hard running a business is.'" And likewise, not every PVV member is driven by a too-thick Koran. "Barry Madlener had a successful company, but went into politics because he had certain ideals in the days of Pim Fortuyn."

The website is a 'front porch' where you can see Tofik Dibi of GroenLinks, on holiday, swimming with the dolphins. "Then you simply see: that is a happy person, a bon vivant who represents me." It may come as no surprise that the flamboyant CDA member and Hague native Jan Schinkelshoek loves chess and is inspired by the works of De Tocqueville; all the more surprising is his confession: "I cheer for Sparta." And PvdA member Harm Evert-Waalkens says: "I kicked off my boots, put on a suit and knotted my tie. And now I sit here in The Hague as a farmer." Ruijter: "People like that make you feel positive. It is about real people, not career politicians who want to be a minister in a few years' time. The vast majority of our parliamentary representation is unknown. For us, discovering the problem, or the idea, coincided with devising a solution." A textbook example of citizens' initiative.

           Prototype system goes after DNS-based botnets http://tco/y9SBVCEa        
          SharpDX, a new managed .Net DirectX API available        
If you have followed my previous work on a new .NET API for Direct3D 11, I proposed this solution to the SlimDX team for the v2 of their framework, joined their team around one month ago, and was actively working to widen the coverage of the DirectX API. I have been able to extend the coverage to almost the whole API, developing Direct2D samples as well as XAudio2 and XAPO samples using it. But due to some incompatible directions that the SlimDX team wanted to follow, I have decided to also release my work under a separate project called SharpDX. Now, you may wonder why I'm releasing this new API under a separate project from SlimDX?

Well, I have been working really hard on this since the beginning of September, and I explained why in my previous post about Direct3D 11. I have checked in lots of code under the v2 branch of SlimDX, while having lots of discussions with the team (mostly Josh, who is mainly responsible for v2) on their development mailing list. The reason I'm leaving the SlimDX team is that it was in fact not clear to me that I was not enrolled as part of the decision-making for the v2 directions, although I was bringing a whole solution (by "whole", I mean a large proof of concept, not something robust or finished). At some point, Josh told me that Promit, Mike and himself, co-founders of SlimDX, were the technical leaders of this project and would have the last word on the direction as well as on decisions about the v2 API.

Unfortunately, I was not expecting to work on such terms with them, considering that I had already made 100% of the whole engineering prototype for the next API. Over the last few days we had lots of small technical discussions, but for some of them I clearly didn't agree with the decisions that were taken, whatever arguments I tried to give them. This is a bit of a disappointment for me, but well, that's the life of open source projects. This is their project and they have other plans for it. So, I have decided to release the project on my own as SharpDX, although you will see that the code is currently exactly the same as on the v2 branch of SlimDX (of course, because until yesterday I was working on the SlimDX v2 branch).

But things are going to change for both projects: SlimDX is taking the robust road (with which I agree) but with some decisions that I don't agree with (in terms of implementation and direction). Although it may sound weird, SharpDX is not intended to compete with SlimDX v2: they clearly have a different scope (supporting, for example, Direct3D 9, which I don't really care about), a different target, a different view on exposing the API, and a large existing community already on SlimDX. So SharpDX is primarily intended for my own work on demomaking, nothing more. I'm releasing it because SlimDX v2 is not going to be available soon, even as an alpha version. On my side, I consider the current state of the SharpDX API usable (although far from being as clean as it should be) and I'm going to use it myself, while improving the generator and parser to make the code safer and more robust.

So, I did lots of work to bring new APIs into this system, including:
  • Direct3D 10
  • Direct3D 10.1
  • Direct3D 11
  • Direct2D 1
  • DirectWrite
  • DXGI
  • DXGI 1.1
  • D3DCompiler
  • DirectSound
  • XAudio2
  • XAPO
And I have also been working on some nice samples, for example using Direct2D and Direct3D 10, including the usage of the Direct2D tessellate API, in order to see how well it works compared to the gluTessellation methods that are most commonly used. You will find that the code to do such a thing is extremely simple in SharpDX:
using System;
using System.Drawing;
using SharpDX.Direct2D1;
using SharpDX.Samples;

namespace TessellateApp
{
    /// <summary>
    /// Direct2D1 Tessellate Demo.
    /// </summary>
    public class Program : Direct2D1DemoApp, TessellationSink
    {
        EllipseGeometry Ellipse { get; set; }
        PathGeometry TesselatedGeometry { get; set; }
        GeometrySink GeometrySink { get; set; }

        protected override void Initialize(DemoConfiguration demoConfiguration)
        {
            // Create an ellipse
            Ellipse = new EllipseGeometry(Factory2D,
                new Ellipse(new PointF(demoConfiguration.Width/2, demoConfiguration.Height/2),
                    demoConfiguration.Width/2 - 100,
                    demoConfiguration.Height/2 - 100));

            // Populate a PathGeometry from Ellipse tessellation
            TesselatedGeometry = new PathGeometry(Factory2D);
            GeometrySink = TesselatedGeometry.Open();

            // Force RoundLineJoin otherwise the tessellated geometry looks buggy at line joins
            GeometrySink.SetSegmentFlags(PathSegment.ForceRoundLineJoin);

            // Tessellate the ellipse to our TessellationSink
            Ellipse.Tessellate(1, this);

            // Close the GeometrySink
            GeometrySink.Close();
        }

        protected override void Draw(DemoTime time)
        {
            // Draw the tessellated geometry
            RenderTarget2D.DrawGeometry(TesselatedGeometry, SceneColorBrush, 1, null);
        }

        void TessellationSink.AddTriangles(Triangle[] triangles)
        {
            // Add tessellated triangles to the opened GeometrySink
            foreach (var triangle in triangles)
            {
                GeometrySink.BeginFigure(triangle.Point1, FigureBegin.Filled);
                GeometrySink.AddLine(triangle.Point2);
                GeometrySink.AddLine(triangle.Point3);
                GeometrySink.EndFigure(FigureEnd.Closed);
            }
        }

        void TessellationSink.Close()
        {
        }

        static void Main(string[] args)
        {
            Program program = new Program();
            program.Run(new DemoConfiguration("SharpDX Direct2D1 Tessellate Demo"));
        }
    }
}

This simple example produces the following output:

which is pretty cool considering the amount of code (although the Direct3D 10 and D2D initialization part would add more); I found this to be much simpler than the gluTessellation API.

You will find also some other samples, like the XAudio2 ones, generating a synthesized sound with the usage of the reverb, and even some custom XAPO sound processors!

You can grab those samples from the SharpDX code repository (there is a working solution with all the samples I have been developing so far, including the MiniTris sample from SlimDX).
          A new managed .NET/C# Direct3D 11 API generated from DirectX SDK headers        
I have been quite busy since the end of August: personally, because I'm proud to announce the birth of my daughter! (and her older brother has, somewhat understandably, been asking for a lot more attention since ;) ) and also because I've been working hard on an exciting new project based on .NET and Direct3D.

What is it? Yet Another Triangle App? Nope, this is in fact an entirely new .NET API for Direct3D 11, DXGI and D3DCompiler that is fully managed, without using any mixed C++/CLI assemblies, but with performance similar to a true C++/CLI API (like SlimDX). But the main characteristic and most exciting thing about this new wrapper is that the whole marshal/interop code is fully generated from the DirectX SDK headers, including the MSDN documentation.

The current key features and benefits of this approach are:

  • API is generated from the DirectX SDK headers : the mapping is able to perform complex transformations, extracting all relevant information (enumerations, structures, interfaces, functions, macro definitions, GUIDs) from the C++ source headers. For example, the mapping process can generate properties for interfaces, or inner grouped interfaces like the ones in SlimDX : instead of writing "device.IASetInputLayout(...)" you can write "device.InputAssembler.InputLayout = ...".
  • Full support of Direct3D 11, DXGI 1.0/1.1 and the D3DCompiler API : thanks to the fully auto-generated process, the actual coverage is 100%. I have limited the generated code to those libraries, but it could quite easily be extended to other APIs (XAudio2, Direct2D, DirectWrite, etc.).
  • Pure managed .NET API : assemblies are compiled with the AnyCpu target. You can run your code on an x64 or an x86 machine with the same assemblies.
  • API extensibility : the generated code is C#, all types are marked "partial" and are easily extensible with new helper methods. The code generator can also hide some methods/types internally, so they can be used from helper methods without appearing in the public API.
  • C++/CLI speed : the framework uses an unusual technique to avoid any C++/CLI while still achieving comparable performance.
  • Separate assemblies : a core assembly containing common classes, plus one assembly per API subgroup (Direct3D, DXGI, D3DCompiler).
  • Lightweight assemblies : the generated assemblies weigh about 300 KB in total, 70 KB compressed in an archive (similar assemblies in C++/CLI would be closer to 1 MB, one per architecture, and would depend on MSVCRT10).
  • API naming convention very close to the SlimDX API (making it 100% identical would only require specifying the correct mapping names while generating the code).
  • Raw DirectX object life management : no overhead of an ObjectTable or RCW mechanism; the API uses direct native management with the classic COM method "Release". Currently, instead of calling Dispose you should call Release (and call AddRef if you duplicate references, as in C++). I may evaluate how to safely integrate a Dispose method call.
  • Easily obfuscatable : since the framework does not use any mixed assemblies, standard .NET obfuscators work on it.
  • DirectX SDK documentation integrated in the .NET XML comments : the whole API is generated along with the MSDN documentation, so you get exactly the same documentation for DirectX and for this API (this works even for method parameters, remarks, enum items, etc.). References to other types inside the documentation are correctly linked to the .NET API.
  • A prototype of partial support for the Effects11 API in fully managed .NET.
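The partial-class extensibility point above can be sketched like this (all type and member names here are hypothetical, for illustration only, not the actual generated API):

```csharp
// Device.Generated.cs -- produced by the code generator.
public partial class Device
{
    // Raw method mirroring the C++ vtbl entry; hidden from the public API.
    internal void CreateBuffer_(IntPtr descPtr, IntPtr initialDataPtr, out IntPtr buffer)
    {
        // ... calli-based interop call goes here ...
        buffer = IntPtr.Zero;
    }
}

// Device.cs -- hand-written file contributing to the same class,
// adding a friendlier helper on top of the generated method.
public partial class Device
{
    public IntPtr CreateBuffer(IntPtr descPtr)
    {
        IntPtr buffer;
        CreateBuffer_(descPtr, IntPtr.Zero, out buffer);
        return buffer;
    }
}
```

Both files compile into a single Device class, so generated code can be regenerated at any time without touching the hand-made helpers.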
If you have been working with SlimDX, some of the features here may sound familiar, and you may wonder why build another .NET DirectX API when there is a great project like SlimDX. Before going further into the details of this wrapper and how things work in the background, I'm going to explain why it could be interesting.

I'm also currently not in a position to release it, because I don't want to compete with SlimDX. I want to see if the SlimDX team would be interested in working together on this system, a kind of joint venture. There are still lots of things to do, improving the mapping and making it more reliable (the whole code here has been written in a rush over the last month...), but I strongly believe this could be a good starting point for SlimDX 2. I might be wrong, and SlimDX may have another road map in mind... So this is a message to the SlimDX team : Promit, Josh, Mike, I would be glad to hear your comments about this wrapper (and if you want, I can send you the generated API so that you can look at it and test it!)

[Updated 30 November 2010]
This wrapper is now available from SharpDX. Check this post.

This post is going to be quite long, so if you are not interested in all the internals, you can jump to the sample code at the end.

An attempt at a SlimDX next gen

First of all, is this related to 4k or 64k intros? (a usual question here, mostly a question for myself :D) Well, while I'm still working on making things smaller, even in .NET, I would like to work on a demo based on .NET (with lots of procedurally generated textures and music). I have been evaluating both XNA and SlimDX, and in September I even built an XNA-like API over SlimDX / Direct3D 11 that worked great, simplifying the code a lot while keeping the benefits of the new D3D11 API (geometry shaders, compute shaders, etc.). I will talk later about this "Demo" layer API.

As a demo maker targeting tiny executables, even in .NET, I found that working with SlimDX was not the best option : even after stripping the code and recompiling SlimDX to keep only DirectX11/DXGI & co, I had roughly a 1 MB DLL (one per architecture) plus a dependency on MSVCRT10, which is a bit annoying. Even for a demo with looser size constraints, I didn't want a 100 KB exe sitting next to 1 MB of compressed external DLLs...

Also, I read some of Josh's thoughts about SlimDX 2 : I was convinced about the need for separate assemblies and simplified object life management, but not convinced by the need to use "interfaces" for the new API, and not really happy about still having platform-specific mixed assemblies to support 32/64-bit architectures correctly (with simple delay loading).

What is SlimDX 2 supposed to address over SlimDX?
  • Making object life management closer to the real thing (no Dispose, but raw Release instead)
  • Multiple assemblies
  • Working on the API more in C# than in C++/CLI
  • Supporting automatic platform architecture switching (transparently running an executable on x86 and x64 machines without recompiling anything).
Recall that around August I was experimenting with parsing the SDK headers using Boost::Wave V2.0. I had developed a SlimDX-like interface in C++ for the Ergon demo, and I found the process very laborious, although straightforward, even while staying in the same language as DirectX... Thinking more about it, and because I wanted to do more work in 3D and C# (damn, this language is SO cool and powerful compared to C++), I realized it would be a great opportunity to see whether enough information could be extracted from the SDK headers to generate a Direct3D 11 .NET/C# API.

And everything went surprisingly fast : extracting all the code information from the SDK C++ header files turned out to be quite easy to code, in a few days... and generating the code was easy as well (I have to admit that I have strong experience with this kind of process; I did similar work around ten years ago, in Java, delivering an innovative Java/COM bridge layer for the company I was working at, much safer than Sun's Java/COM layer, which was buggy, and much more powerful, supporting early binding, inheritance, documentation, etc.).

In fact, with this generation process, I have been able to address almost all the issues that SlimDX 2 was expected to solve, and it even goes a bit further : the process is automated, and it supports x86/x64 without requiring any mixed assemblies.

In the following sections, I'm going to explain in depth the architecture, features, internals and mapping rules used to generate this new .NET wrapper (which currently has the code name "SharpDX").


In order to generate a managed .NET API for DirectX from the SDK headers, the process is composed of 3 main steps:
  1. Convert from the DirectX SDK C++ headers to an intermediate format called "XIDL", which is a mix of XML and "IDL". This first part is responsible for reverse-engineering the headers, extracting back all the existing useful information (more in the following section), and producing a kind of IDL (Intermediate Definition Language). In fact, if I had access to the IDL used internally at Microsoft, it wouldn't have been necessary to write this whole part, but sadly the DirectX 11 IDL is not available, although you can clearly see from D3D11.h that the file is generated from an IDL. This module is also responsible for accessing the MSDN website, crawling the needed documentation and associating it with all the language elements (structures, structure fields, enums, enum items, interfaces, interface methods, method parameters, etc.). Once a piece of documentation has been retrieved, it is stored on disk and is not fetched again the next time the conversion process is re-run.
  2. Convert from the XIDL file to several C# files. This part is responsible for translating the C++ definitions to C# definitions based on a set of mapping rules. The mapping handles things as complex as deciding which include maps to which assembly/namespace, which types should be moved to another assembly/namespace, how to rename types, functions, fields and parameters, and how to add information missing from the XIDL file. The current mapping rules are expressed in fewer than 600 lines of C# code... There is also a trick here not shown in the picture : this process generates a small interop assembly which is used only at compile time, is regenerated dynamically at runtime, and is responsible for filling the gap between what is possible in C# and what you can do in C++/CLI (C++/CLI can emit lots of small useful IL instructions that are not accessible from C#; this assembly is there for that... more on this below).
  3. Integrate the generated files into several Visual Studio projects and a global solution. Each project produces an assembly. This is where you can add custom code that cannot be generated (like Vector3 math functions, or general framework objects like ComObject). The generated code is fully marked with "partial" classes, one of the cool things in C# : multiple files can contribute to the same class declaration, making it easy to keep generated code side by side with custom hand-made code.

Recovering a DirectX IDL from the headers

Unfortunately, I did not find a workable C preprocessor written in .NET, and this part was a bit laborious to get working. The good news is that I found Boost Wave 2.0 in C++. The bad news is that this library, written in a heavy boost-STL-template philosophy, was really hard to get working inside a C++/CLI DLL. The plan was to embed Boost Wave in a managed DLL in order to use it from C#... but after several attempts, I was not able to build it with C++/CLI under .NET 4.0. So I ended up with a small COM wrapper DLL around Boost Wave, and a thin .NET wrapper calling that DLL. Compiling Boost Wave was also sometimes a nightmare : I tried to implement my own stream provider for Wave... but I was fighting a linker error that froze VS2010 for five seconds just to display it (several kilobytes of a single cascaded template error)... and I eventually found a note in the Wave release that this was in fact not supported... Templates are supposed to make life easier, but used this way they leave a really bad impression (and I'm not a beginner with C++ templates...).

Anyway, after succeeding in wrapping the Boost Wave API, I had a stream of tokens to process. I wrote a handwritten C/C++ parser targeted at reading well-formed DirectX headers and nothing else. It was quite tricky at times, and the code is far from failsafe, but I managed to parse most of the DirectX headers correctly. During the mapping to C#, I found a couple of errors in the parser that were easy to fix.

In the end, this parser is able to extract from the headers:
  • Enumerations, Structures, Interfaces, Functions, Typedefs
  • Macro definitions
  • GUIDs
  • Include dependency
All this data is stored in a C# model that is marshaled to XML using WCF (DataMember, DataContract), which makes the code really easy to write and not very intrusive, while letting you serialize and deserialize to XML. For example, a CppType is defined like this:

using System.Runtime.Serialization;

namespace SharpDX.Tools.XIDL
{
    [DataContract]
    public class CppType : CppElement
    {
        [DataMember(Order = 0)]
        public string Type { get; set; }
        [DataMember(Order = 1)]
        public string Specifier { get; set; }
        [DataMember(Order = 2)]
        public bool Const { get; set; }
        [DataMember(Order = 3)]
        public bool IsArray { get; set; }
        [DataMember(Order = 4)]
        public string ArrayDimension { get; set; }
    }
}
The model is really lightweight, no fancy methods and easy to navigate in.
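The WCF-based round trip mentioned above can be exercised with a tiny self-contained example (CppType here is a simplified stand-in for the real XIDL model class, not the actual one):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

// Simplified stand-in for the XIDL model class described above.
[DataContract]
public class CppType
{
    [DataMember(Order = 0)] public string Type { get; set; }
    [DataMember(Order = 1)] public bool IsArray { get; set; }
}

public static class Program
{
    // Serialize a model object to XML and read it back, as the
    // XIDL conversion process does when caching its model on disk.
    public static CppType RoundTrip(CppType value)
    {
        var serializer = new DataContractSerializer(typeof(CppType));
        var buffer = new MemoryStream();
        serializer.WriteObject(buffer, value);
        buffer.Position = 0;
        return (CppType)serializer.ReadObject(buffer);
    }

    public static void Main()
    {
        var result = RoundTrip(new CppType { Type = "ID3D11Device", IsArray = false });
        Console.WriteLine(result.Type); // the model survives an XML round-trip
    }
}
```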

The process is also responsible for getting documentation for each C++ item (enumerations, structures, interfaces, functions). The documentation is requested from MSDN while generating all the types. It was also a bit tricky to parse, but in the end the class is very small (less than 200 lines of C# code). Downloaded documentation is stored on disk and reused on later runs of the parser.

The generated XML model weighs around 1.7 MB for the DXGI, D3D11, D3DX11 and D3DCompiler includes, and looks like this:

          <Description>A device-child interface accesses data used by a device.</Description>
          <Remarks i:nil="true" />
              <Description>Get a pointer to the device that created this interface.</Description>
              <Remarks>Any returned interfaces will have their reference count incremented by one, so be sure to call ::release() on the returned pointer(s) before they are freed or else you will have a memory leak.</Remarks>
                <Name i:nil="true" />
                <Description>Returns nothing.</Description>
                <Remarks i:nil="true" />
                <ArrayDimension i:nil="true" />
                  <Description>Address of a pointer to a device (see {{ID3D11Device}}).</Description>
                  <Remarks i:nil="true" />
                  <ArrayDimension i:nil="true" />

One of the most important things in the DirectX headers for developing a reliable code generator is the presence of C++ Windows-specific annotations : all method parameters are prefixed by macros such as __out, __in, __out_opt, __out_buffer, etc. These annotations are similar to C# attributes and explain how to interpret each parameter. If you take the previous code, the GetDevice method returns an ID3D11Device through an [out] parameter. The [Out] annotation is extremely important here, since it tells us exactly how to use the parameter. The same goes for a pointer that is in fact a buffer : with the annotations, you know that there is an array of elements behind the pointer.
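As a concrete illustration, a C++ declaration like `void GetDevice(__out ID3D11Device **ppDevice)` can be surfaced in C# as an `out` parameter; the wrapper below is only a sketch of the pattern (the Device type and its constructor are illustrative, not the actual generated code):

```csharp
// C++ :  void GetDevice(__out ID3D11Device **ppDevice);
// The __out annotation tells the generator to emit a C# out parameter,
// so callers never have to deal with the double pointer themselves.
public void GetDevice(out Device device)
{
    IntPtr nativePointer;
    // ... calli-based interop call filling nativePointer from the [out] parameter ...
    nativePointer = IntPtr.Zero; // placeholder for the generated interop call
    device = new Device(nativePointer);
}
```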

That said, I discovered that some functions/methods occasionally lack some annotations... but fortunately, the next process (the mapping from XIDL to C#) is able to add missing information like this.

As I said, the current implementation is far from failsafe and would probably require more testing on other header files. At least, the process works correctly on the subset of DirectX headers used here.

Generate C# from IDL

This part of the process was much more time consuming. I started with enums, which were quite straightforward to handle. Structures required a bit more work, since some of them need custom marshalling... Interface methods were the hardest part; correctly handling every parameter case was not easy...

The process of generating the C# code is done in 3 steps:
  1. Read the XIDL model and prepare it for mapping: remove types, add information to some methods.
  2. Generate a C# model from the XIDL model and a set of mapping rules.
  3. Generate C# files from the C# model. I used the T4 "Text Template Transformation Toolkit" engine as the text templatizer; it is part of VS2010, really easy to use, and integrated in VS2010 with a third-party syntax highlighting plugin.
This step is also responsible for generating an interop assembly that emits .NET IL bytecode directly through System.Reflection.Emit. This interop assembly is the trick that avoids using a C++/CLI mixed assembly.

Preamble) How to avoid C++/CLI in C#

If you inspect some generated C++/CLI code with Reflector, you will see that most of it is in fact pure IL bytecode, even where there is a call to a native function or method...

The trick here is that there are a couple of IL instructions that the runtime supports (and that C++/CLI emits) but that are not exposed in the C# language.

1) The instruction "calli"

This instruction calls an unmanaged function directly, without going through the P/Invoke interop layer (in fact, P/Invoke ends up issuing a "calli" too, but it first performs much more complex marshaling of parameters, structures, etc.).

What I needed was a way to call an unmanaged function/method without going through the P/Invoke layer, and "calli" exists exactly for this. Now, suppose we could generate a small assembly, at compile time and at runtime, responsible for hosting those calli calls: we would no longer need C++/CLI at all.

For example, suppose I want to call a C++ method of an interface which takes an integer as parameter, something like:
interface IDevice : IUnknown {
    void Draw(int count);
};
I only need a function in C# that can call this method directly, without going through the P/Invoke layer, given a pointer to the C++ IDevice object and the offset of the method in the vtbl (the offset is expressed in bytes, for an x86 architecture here):
class Interop {
    public static unsafe void CalliVoid(void* thisObject, int vtblOffset, int arg0);
}

// A pointer to a native IDevice instance
void* ptrToIDevice = ...;

// A call to the Draw method, slot 3 in the vtbl (slots 0 to 2 are the IUnknown methods)
Interop.CalliVoid(ptrToIDevice, /* 3 * sizeof(void*) on x86 */ 3 * 4, /* count */ 4);

The IL bytecode content of this method for the x64 architecture would typically look like this in C++/CLI:
.method public hidebysig static void CalliVoid(void* arg0, int32 arg1, int32 arg2) cil managed
.maxstack 4
L_0000: ldarg.0 // Load (0) this arg (1st parameter for native method)
L_0001: ldarg.2 // Load (1) count arg
L_0002: ldarg.1 // Offset in vtbl
L_0003: conv.i // Convert to native int
L_0004: dup //
L_0005: add // Offset = offset * 2 (only for x64 architecture)
L_0006: ldarg.0 //
L_0007: ldind.i // Load vtbl pointer
L_0008: add // pVtbl = pVtbl + offset
L_0009: ldind.i // load function pointer from the vtbl
L_000a: calli method unmanaged stdcall void *(void*, int32)
L_000f: ret

This kind of code is automatically inlined by the JIT (which, according to the SSCLI/Rotor source code, inlines functions smaller than 25 bytes of bytecode).

If you look at a C++/CLI assembly, you will see lots of "calli" instructions.

So how is this trick used in the end? Because the generator knows all the methods of all the interfaces, it can generate the set of all calling conventions needed to reach unmanaged objects. The XIDLToCSharp generator produces an assembly containing all the interop methods (around 66 methods using calli):
public class Interop
private Interop();
public static unsafe float CalliFloat(void* arg0, int arg1, void* arg2);
public static unsafe int CalliInt(void* arg0, int arg1);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2);
public static unsafe int CalliInt(void* arg0, int arg1, long arg2);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2, int arg3);
public static unsafe int CalliInt(void* arg0, int arg1, long arg2, int arg3);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2, int arg3);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2, void* arg3);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2, void* arg3);
public static unsafe int CalliInt(void* arg0, int arg1, IntPtr arg2, void* arg3);
public static unsafe int CalliInt(void* arg0, int arg1, IntPtr arg2, int arg3);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2, void* arg3, int arg4);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2, void* arg3, void* arg4);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2, int arg3, void* arg4);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2, int arg3, void* arg4);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2, void* arg3, void* arg4);
public static unsafe int CalliInt(void* arg0, int arg1, IntPtr arg2, void* arg3, void* arg4);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2, void* arg3, int arg4);
public static unsafe int CalliInt(void* arg0, int arg1, int arg2, int arg3, void* arg4, void* arg5);
public static unsafe int CalliInt(void* arg0, int arg1, void* arg2, void* arg3, int arg4, int arg5);
// ...[stripping Calli x methods here]...
public static unsafe void CalliVoid(void* arg0, int arg1, int arg2, void* arg3, void* arg4, int arg5, int arg6, void* arg7);
public static unsafe void CalliVoid(void* arg0, int arg1, void* arg2, float arg3, float arg4, float arg5, float arg6, void* arg7);
public static unsafe void CalliVoid(void* arg0, int arg1, int arg2, void* arg3, void* arg4, int arg5, int arg6, void* arg7, void* arg8);
public static unsafe void CalliVoid(void* arg0, int arg1, void* arg2, int arg3, int arg4, int arg5, int arg6, void* arg7, int arg8, void* arg9);
public static unsafe void* Read<T>(void* pSrc, ref T data) where T: struct;
public static unsafe void* Read<T>(void* pSrc, T[] data, int offset, int count) where T: struct;
public static unsafe void* Write<T>(void* pDest, ref T data) where T: struct;
public static unsafe void* Write<T>(void* pDest, T[] data, int offset, int count) where T: struct;
public static void memcpy(void* pDest, void* pSrc, int Count);

This assembly is used at compile time but is not distributed at runtime. Instead, it is regenerated dynamically at runtime in order to account for bytecode differences between x86 and x64 (in the calli example, the vtbl offset must be multiplied by 2 on x64, because a pointer there is 8 bytes instead of 4).
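A minimal sketch of that runtime generation using System.Reflection.Emit (this shows the general shape of the technique, not SharpDX's actual generator): build a DynamicMethod whose body loads the arguments, computes the vtbl slot offset with IntPtr.Size so the same code is correct on both x86 and x64, and finishes with EmitCalli:

```csharp
using System;
using System.Reflection.Emit;
using System.Runtime.InteropServices;

public static class CalliGenerator
{
    // Builds the equivalent of CalliVoid(void* thisPtr, int vtblIndex, int arg0)
    // at runtime. Using IntPtr.Size for the slot size keeps the vtbl arithmetic
    // correct on x86 (4-byte slots) and x64 (8-byte slots) alike.
    public static Action<IntPtr, int, int> BuildCalliVoid()
    {
        var method = new DynamicMethod("CalliVoid", typeof(void),
            new[] { typeof(IntPtr), typeof(int), typeof(int) });
        var il = method.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);             // push native this pointer
        il.Emit(OpCodes.Ldarg_2);             // push the int argument
        il.Emit(OpCodes.Ldarg_1);             // vtbl index
        il.Emit(OpCodes.Ldc_I4, IntPtr.Size); // slot size for this platform
        il.Emit(OpCodes.Mul);                 // byte offset into the vtbl
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldind_I);             // load vtbl pointer
        il.Emit(OpCodes.Add);                 // address of the slot
        il.Emit(OpCodes.Ldind_I);             // load the function pointer
        il.EmitCalli(OpCodes.Calli, CallingConvention.StdCall,
            typeof(void), new[] { typeof(IntPtr), typeof(int) });
        il.Emit(OpCodes.Ret);
        return (Action<IntPtr, int, int>)method.CreateDelegate(
            typeof(Action<IntPtr, int, int>));
    }
}
```

Invoking the returned delegate of course requires a real COM object pointer; the sketch only shows how the method body can be emitted once per calling convention.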

2) The instruction "sizeof" for generic

Although calli is the real trick that makes it possible to call unmanaged methods in a managed way without using P/Invoke, I found a couple of other IL instructions necessary to reach feature parity with C++/CLI.

The other one is sizeof for generics. C# has a sizeof operator, but while trying to replicate SlimDX's DataStream class in pure C#, I was not able to write this kind of code:
public class DataStream {
    // Unmarshal a struct from a memory location
    public T Read<T>() where T : struct {
        T myStruct = default(T);
        memcpy(&myStruct, &m_buffer, sizeof(T)); // error: sizeof(T) is rejected by C#
        return myStruct;
    }
}
In fact, in C#, sizeof does not work on a generic type parameter, even if you constrain the generic to be a struct. Because C# cannot guarantee that the struct contains only blittable fields (well, it could, but it doesn't try to), it refuses to take the size of a generic struct. That was annoying, but since this works fine at the raw IL level and I was already generating the interop assembly, I was free to add whatever methods with custom bytecode were needed to fill the gap...

In the end, the interop code to read a generic struct from a memory location looks like this :
// This method is reading a T struct from pSrc and returning the address : pSrc + sizeof(T)
.method public hidebysig static void* Read<valuetype .ctor T>(void* pSrc, !!T& data) cil managed
.maxstack 3
.locals init (
[0] int32 num,
[1] !!T* pinned localPtr)
L_0000: ldarg.1
L_0001: stloc.1
L_0002: ldloc.1
L_0003: ldarg.0
L_0004: sizeof !!T
L_000a: conv.i4
L_000b: stloc.0
L_000c: ldloc.0
L_000d: unaligned 1 // Mandatory for x64 architecture
L_0010: nop
L_0011: nop
L_0012: nop
L_0013: cpblk // Memcpy
L_0015: ldloc.0
L_0016: conv.i
L_0017: ldarg.0
L_0018: add
L_0019: ret

3) The instruction "cpblk", memcpy in IL

In the previous function, you can see the use of the "cpblk" bytecode instruction. In fact, when you look at a C++/CLI method using a memcpy, it does not call the memcpy from the C CRT but uses the IL instruction that performs the same task directly. This IL instruction is faster than any kind of interop, so I exposed it to C# through the interop assembly.

I) Prepare XIDL model for mapping

So the first step in the XIDLToCSharp process is to prepare the XIDL model to be more mapping friendly. This step is essentially responsible for:
  • Adding missing C++ attribute information (In, InOut, Buffer) to some method parameters
  • Replacing the types of some method parameters : for example, in DirectX there are lots of parameters that take flags for which an enum is already declared... but for some unknown reason the method is declared with an "int" instead of the enum...
  • Removing some types. For example, D3D_PRIMITIVE_TOPOLOGY holds a bunch of D3D11 and D3D10 enum items duplicating the D3D_PRIMITIVE items... so I remove them.
  • Adding tags directly on the XIDL model to ease the next mapping step : those tags are used, for example, to set the C# visibility of a method, or to force a method not to be interpreted as a "property".
// Read the XIDL model
CppIncludeGroup group = CppIncludeGroup.Read("directx_idl.xml");

group.Modify<CppParameter>("^D3DX11.*?::pDefines", Modifiers.ParameterAttribute(CppAttribute.In | CppAttribute.Buffer | CppAttribute.Optional));

// Modify device Flags for D3D11CreateDevice to use D3D11_CREATE_DEVICE_FLAG
group.Modify<CppParameter>("^D3D11CreateDevice.*?::Flags$", Modifiers.Type("D3D11_CREATE_DEVICE_FLAG"));

// ppFactory on CreateDXGIFactory.* should be Attribute.Out
group.Modify<CppParameter>("^CreateDXGIFactory.*?::ppFactory$", Modifiers.ParameterAttribute(CppAttribute.Out));

// pDefines is an array of Macro (and not just In)
group.Modify<CppParameter>("^D3DCompile::pDefines", Modifiers.ParameterAttribute(CppAttribute.In | CppAttribute.Buffer | CppAttribute.Optional));
group.Modify<CppParameter>("^D3DPreprocess::pDefines", Modifiers.ParameterAttribute(CppAttribute.In | CppAttribute.Buffer | CppAttribute.Optional));

// SwapChain description is mandatory In and not optional
group.Modify<CppParameter>("^D3D11CreateDeviceAndSwapChain::pSwapChainDesc", Modifiers.ParameterAttribute(CppAttribute.In));

// Remove all enums ending with _FORCE_DWORD, FORCE_UINT
group.Modify<CppEnumItem>("^.*_FORCE_DWORD$", Modifiers.Remove);
group.Modify<CppEnumItem>("^.*_FORCE_UINT$", Modifiers.Remove);

You can see that the pre-mapping (and the mapping) uses regular expressions intensively for matching names, which is a very convenient way to perform XPath-like requests with regex patterns.
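A quick way to see why regular expressions make good selectors here, using .NET's Regex directly with one of the rule strings from the listing above:

```csharp
using System;
using System.Text.RegularExpressions;

public static class Program
{
    public static void Main()
    {
        // The rule "^D3D11CreateDevice.*?::Flags$" selects the Flags parameter
        // of D3D11CreateDevice and of D3D11CreateDeviceAndSwapChain alike,
        // while leaving every other parameter untouched.
        var rule = new Regex("^D3D11CreateDevice.*?::Flags$");
        Console.WriteLine(rule.IsMatch("D3D11CreateDevice::Flags"));             // True
        Console.WriteLine(rule.IsMatch("D3D11CreateDeviceAndSwapChain::Flags")); // True
        Console.WriteLine(rule.IsMatch("D3D11CreateDevice::pAdapter"));          // False
    }
}
```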

II) Generate C# model from XIDL and mapping rules

This process takes the pre-processed XIDL and generates a C# model (a subset of the C# object model, in memory), adding mapping information and preparing things so that the model is easy to consume from the T4 templatizer engine.

In order to generate the C# model from DirectX, the generator needs a couple of mapping rules.

1) Mapping an include to an assembly / namespace

This rule defines the default dispatching of types to an assembly/namespace. It associates a source header include (the name of the .h file, without the extension) with a target namespace.
// Namespace mapping 

// Map dxgi include to assembly SharpDX.DXGI, namespace SharpDX.DXGI
gen.MapIncludeToNamespace("dxgi", "SharpDX.DXGI");
gen.MapIncludeToNamespace("dxgiformat", "SharpDX.DXGI");
gen.MapIncludeToNamespace("dxgitype", "SharpDX.DXGI");

// Map D3DCommon include to assembly SharpDX, namespace SharpDX.Direct3D
gen.MapIncludeToNamespace("d3dcommon", "SharpDX.Direct3D", "SharpDX");

gen.MapIncludeToNamespace("d3d11", "SharpDX.Direct3D11");
gen.MapIncludeToNamespace("d3dx11", "SharpDX.Direct3D11");
gen.MapIncludeToNamespace("d3dx11core", "SharpDX.Direct3D11");
gen.MapIncludeToNamespace("d3dx11tex", "SharpDX.Direct3D11");
gen.MapIncludeToNamespace("d3dx11async", "SharpDX.Direct3D11");
gen.MapIncludeToNamespace("d3d11shader", "SharpDX.D3DCompiler");
gen.MapIncludeToNamespace("d3dcompiler", "SharpDX.D3DCompiler");

2) Mapping a particular type to an assembly / namespace

It is also necessary to override the default include-to-assembly/namespace dispatching for some particular types. This rule does exactly that.
gen.MapTypeToNamespace("^D3D_PRIMITIVE$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_CBUFFER_TYPE$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_RESOURCE_RETURN_TYPE$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_SHADER_CBUFFER_FLAGS$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_SHADER_INPUT_TYPE$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_SHADER_VARIABLE_CLASS$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_SHADER_VARIABLE_FLAGS$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_SHADER_VARIABLE_TYPE$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_TESSELLATOR_DOMAIN$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_TESSELLATOR_PARTITIONING$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_TESSELLATOR_OUTPUT_PRIMITIVE$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_SHADER_INPUT_FLAGS$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_NAME$", "SharpDX.D3DCompiler");
gen.MapTypeToNamespace("^D3D_REGISTER_COMPONENT_TYPE$", "SharpDX.D3DCompiler");

The previous code instructs the generator to move some D3D types to the SharpDX.D3DCompiler namespace (and assembly). Those types are in fact related to shader reflection and belong with the D3DCompiler assembly (I took the same design choice as SlimDX, although another mapping would be conceivable).

3) Mapping a C++ type to a custom C# type

It is sometimes necessary to map a C++ type to a non-generated C# type. For example, the C++ "RECT" structure is not strictly equivalent to System.Drawing.Rectangle (RECT uses Left, Top, Right, Bottom fields instead of Left, Top, Width, Height in System.Drawing.Rectangle). This mapping can define such a custom association. The SharpDX.Rectangle type is not produced by the generator but defined by hand in the SharpDX assembly project (see the last part).
var rectType = new CSharpStruct();
rectType.Name = "SharpDX.Rectangle";
rectType.SizeOf = 4*4;
gen.MapCppTypeToCSharpType("RECT", rectType);

4) Mapping a C++ name to a C# name
The renaming rules are quite rich. XIDLToCSharp provides a default renaming mechanism that respects the CamelCase convention, but there are some exceptions that need to be handled. For example:
// Rename DXGI_MODE_SCALING to DisplayModeScaling
gen.RenameType(@"^DXGI_MODE_SCALING$", "DisplayModeScaling");
gen.RenameType(@"^DXGI_MODE_SCANLINE_ORDER$", "DisplayModeScanlineOrder");

// Use regular expression to take the part of some names...
gen.RenameType(@"^D3D_SVC_(.*)", "$1");
gen.RenameType(@"^D3D_SVF_(.*)", "$1");
gen.RenameType(@"^D3D_SVT_(.*)", "$1");
gen.RenameType(@"^D3D_SIF_(.*)", "$1");
gen.RenameType(@"^D3D_SIT_(.*)", "$1");
gen.RenameType(@"^D3D_CT_(.*)", "$1");

For structures and enums that use the "_" underscore to separate name sub-parts, you can let XIDLToCSharp rename each sub-part automatically, while still being able to specify how a particular sub-part should be renamed:
// Expand sub part between underscore
gen.RenameTypePart("^DESC$", "Description");
gen.RenameTypePart("^CBUFFER$", "ConstantBuffer");
gen.RenameTypePart("^TBUFFER$", "TextureBuffer");
gen.RenameTypePart("^BUFFEREX$", "ExtendedBuffer");
gen.RenameTypePart("^FUNC$", "Function");
gen.RenameTypePart("^FLAG$", "Flags");
gen.RenameTypePart("^SRV$", "ShaderResourceView");
gen.RenameTypePart("^DSV$", "DepthStencilView");
gen.RenameTypePart("^RTV$", "RenderTargetView");
gen.RenameTypePart("^UAV$", "UnorderedAccessView");
gen.RenameTypePart("^TEXTURE1D$", "Texture1D");
gen.RenameTypePart("^TEXTURE2D$", "Texture2D");
gen.RenameTypePart("^TEXTURE3D$", "Texture3D");

With these rules, for a struct named "BLABLA_DESC", the DESC part will be expanded to "Description", resulting in the C# name "BlablaDescription".
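The expansion described above is easy to sketch: split the C++ name on underscores, replace known sub-parts, and camel-case the rest. This is a simplified re-implementation of the effect of RenameTypePart, not the generator's actual code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Program
{
    // A small sample of the sub-part replacement table shown above.
    static readonly Dictionary<string, string> PartRenames = new Dictionary<string, string>
    {
        { "DESC", "Description" },
        { "CBUFFER", "ConstantBuffer" },
        { "SRV", "ShaderResourceView" },
    };

    // "BLABLA_DESC" -> "BlablaDescription"
    public static string ToCSharpName(string cppName)
    {
        return string.Concat(cppName.Split('_').Select(part =>
            PartRenames.ContainsKey(part)
                ? PartRenames[part]                                   // explicit rename
                : char.ToUpper(part[0]) + part.Substring(1).ToLower() // default CamelCase
        ));
    }

    public static void Main()
    {
        Console.WriteLine(ToCSharpName("BLABLA_DESC")); // BlablaDescription
        Console.WriteLine(ToCSharpName("BUFFER_SRV"));  // BufferShaderResourceView
    }
}
```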

5) Change Field type mapping in C#

Again, there are lots of enums in DirectX that are not used by the structures that should reference them. For example, in D3D11_BUFFER_DESC, all flag fields are declared as int instead of their respective enums.

This mapping rule changes the destination type of a field:
gen.ChangeStructFieldTypeToNative("D3D11_BUFFER_DESC", "BindFlags", "D3D11_BIND_FLAG");
gen.ChangeStructFieldTypeToNative("D3D11_BUFFER_DESC", "CPUAccessFlags", "D3D11_CPU_ACCESS_FLAG");
gen.ChangeStructFieldTypeToNative("D3D11_BUFFER_DESC", "MiscFlags", "D3D11_RESOURCE_MISC_FLAG");

6) Generate enums from C++ macros, improving enums

Again, the DirectX SDK is not consistent with enums. Some enums are in fact defined through macro definitions, which makes the IntelliSense experience nonexistent...

XIDLToCSharp is able to create an enum from a set of macros definitions
// Create enums from macro definitions
// Create the D3DCOMPILE_SHADER_FLAGS C++ type from the D3DCOMPILE_.* macros
gen.CreateEnumFromMacros(@"^D3DCOMPILE_[^E][^F].*", "D3DCOMPILE_SHADER_FLAGS");
gen.CreateEnumFromMacros(@"^D3DCOMPILE_EFFECT_.*", "D3DCOMPILE_EFFECT_FLAGS");
gen.CreateEnumFromMacros(@"^D3D_DISASM_.*", "D3DCOMPILE_DISASM_FLAGS");

There are also some tiny things to adjust on existing enums, like adding a "None = 0" item for some flags.
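The idea behind CreateEnumFromMacros can be sketched like this (a hypothetical Python illustration with a simplified exclusion regex; the macro values are the real D3DCOMPILE_* flag values, but the function is mine, not the tool's code):

```python
import re

# Sample macro table, as the parser would collect it from the headers.
MACROS = {
    "D3DCOMPILE_DEBUG": 1 << 0,
    "D3DCOMPILE_SKIP_VALIDATION": 1 << 1,
    "D3DCOMPILE_SKIP_OPTIMIZATION": 1 << 2,
    "D3DCOMPILE_EFFECT_CHILD_EFFECT": 1 << 0,  # belongs to another enum
}

def create_enum_from_macros(pattern: str, enum_name: str, macros: dict) -> dict:
    # Collect every macro whose name matches the regex into one flags enum.
    regex = re.compile(pattern)
    items = {name: value for name, value in macros.items() if regex.match(name)}
    items.setdefault("None", 0)  # flags enums get a "None = 0" item
    return {enum_name: items}

# Simplified pattern: exclude the D3DCOMPILE_EFFECT_* macros.
enum = create_enum_from_macros(r"^D3DCOMPILE_[^E].*", "D3DCOMPILE_SHADER_FLAGS", MACROS)
```

This is why the regex in the real rule is written to exclude the `D3DCOMPILE_EFFECT_` prefix: the EFFECT macros must land in their own D3DCOMPILE_EFFECT_FLAGS enum.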

7) Move interface methods to inner interfaces in C#

If you have been using Direct3D 11, you have noticed that all methods of each stage are prefixed with the stage abbreviation, making for example the ID3D11DeviceContext interface quite ugly to use, ending in calls like deviceContext->IASetInputLayout(...).

SlimDX did something really nice: for each pipeline stage (IA for InputAssembler, VS for VertexShader...), they created a property accessor to an interface exposing the methods of that stage, resulting in improved readability and a much better IntelliSense experience.
deviceContext.InputAssembler.InputLayout = inputlayout; 

In XIDLToCSharp, there is a rule to handle such a case, and it is as simple as writing this:
// Map all IA* methods to the inner interface InputAssemblerStage with the accessor property InputAssembler, using the method name $1 (extracted from the regexp)
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::IA(.*)", "InputAssemblerStage", "InputAssembler", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::VS(.*)", "VertexShaderStage", "VertexShader", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::PS(.*)", "PixelShaderStage", "PixelShader", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::GS(.*)", "GeometryShaderStage", "GeometryShader", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::SO(.*)", "StreamOutputStage", "StreamOutput", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::DS(.*)", "DomainShaderStage", "DomainShader", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::HS(.*)", "HullShaderStage", "HullShader", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::RS(.*)", "RasterizerStage", "Rasterizer", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::OM(.*)", "OutputMergerStage", "OutputMerger", "$1");
gen.MoveMethodsToInnerInterface("ID3D11DeviceContext::CS(.*)", "ComputeShaderStage", "ComputeShader", "$1");
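What such a rule does internally can be sketched in a few lines (a hypothetical Python re-implementation for illustration, not the actual XIDLToCSharp code): the capture group of the regex becomes the new method name, and matching methods are moved under an inner interface reached through an accessor property.

```python
import re

def move_methods(methods, pattern, inner_interface, accessor):
    # Split a flat method list into (inner interface, accessor) buckets,
    # renaming each matched method to its regex capture group "$1".
    regex = re.compile(pattern)
    moved, remaining = {}, []
    for name in methods:
        m = regex.match(name)
        if m:
            moved.setdefault((inner_interface, accessor), []).append(m.group(1))
        else:
            remaining.append(name)
    return moved, remaining

methods = ["IASetInputLayout", "IASetVertexBuffers", "VSSetShader", "Draw"]
moved, remaining = move_methods(methods, r"^IA(.*)", "InputAssemblerStage", "InputAssembler")
# moved == {("InputAssemblerStage", "InputAssembler"): ["SetInputLayout", "SetVertexBuffers"]}
# remaining == ["VSSetShader", "Draw"]
```

Applied to ID3D11DeviceContext, this is exactly what turns `IASetInputLayout` into `InputAssembler.SetInputLayout`.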

8) Dispatch method to function group

DirectX C++ functions are mapped to a set of function groups, each with an associated DLL. For example, it is possible to specify that all D3D11.* functions will map to a class D3D11 containing all the associated methods.
// Function group
var d3dCommonFunctionGroup = gen.CreateFunctionGroup("SharpDX", "SharpDX.Direct3D", "D3DCommon");
var dxgiFunctionGroup = gen.CreateFunctionGroup("SharpDX.DXGI", "SharpDX.DXGI", "DXGI");
var d3dFunctionGroup = gen.CreateFunctionGroup("SharpDX.D3DCompiler", "SharpDX.D3DCompiler", "D3D");
var d3d11FunctionGroup = gen.CreateFunctionGroup("SharpDX.Direct3D11", "SharpDX.Direct3D11", "D3D11");
var d3dx11FunctionGroup = gen.CreateFunctionGroup("SharpDX.Direct3D11", "SharpDX.Direct3D11", "D3DX11");

// Map All D3D11 functions to D3D11 Function Group
gen.MapFunctionToFunctionGroup(@"^D3D11.*", "d3d11.dll", d3d11FunctionGroup);

// Map All D3DX11 functions to D3DX11 Function Group
gen.MapFunctionToFunctionGroup(@"^D3DX11.*", group.Find<CppMacroDefinition>("D3DX11_DLL_A").FirstOrDefault().StripStringValue, d3dx11FunctionGroup);

// Map the D3DCreateBlob function to the D3DCommon Function Group
string d3dCompilerDll = group.Find<CppMacroDefinition>("D3DCOMPILER_DLL_A").FirstOrDefault().StripStringValue;
gen.MapFunctionToFunctionGroup(@"^D3DCreateBlob$", d3dCompilerDll, d3dCommonFunctionGroup);

If a DLL has a versioned name (like D3DX11_xx.dll or D3DCompiler_xx.dll), we directly retrieve the DLL name from a macro!

Generate C# code from C# model and adding custom classes

Once the internal C# model is built, we call the T4 text template toolkit engine for each group of types: Enumerations, Structures, Interfaces, Functions. Those classes are then integrated in several VS projects, together with some custom code and some non-generated core classes.

The generated C# interop code

For each assembly and each namespace, there will be Enumerations.cs, Structures.cs, Interfaces.cs and Functions.cs files generated.

For each type, a custom mapping is done:
  • For enums, the mapping is straightforward, resulting in an almost one-to-one mapping
  • For structures, the mapping is also quite straightforward for most types, although there are a couple of cases where the mapping needs to generate some marshalling code, essentially when there is a bool in the struct, a string pointer, or a fixed array of structs inside a struct.
For example, one of the most complex mapping for a structure is generated like this:

/// <summary> 
/// Describes the blend state.
/// </summary>
/// <remarks>
/// These are the default values for blend state: AlphaToCoverageEnable = FALSE; IndependentBlendEnable = FALSE; RenderTarget[0].BlendEnable = FALSE; RenderTarget[0].SrcBlend = D3D11_BLEND_ONE; RenderTarget[0].DestBlend = D3D11_BLEND_ZERO; RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD; RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE; RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO; RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD; RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL. Note that D3D11_BLEND_DESC is identical to {{D3D10_BLEND_DESC1}}. If the driver type is set to <see cref="SharpDX.Direct3D.DriverType.Hardware"/>, the feature level is set to less than or equal to <see cref="SharpDX.Direct3D.FeatureLevel.Level_9_3"/>, and the pixel format of the render target is set to <see cref="SharpDX.DXGI.Format.R8G8B8A8_UNorm_SRgb"/>, DXGI_FORMAT_B8G8R8A8_UNORM_SRGB, or DXGI_FORMAT_B8G8R8X8_UNORM_SRGB, the display device performs the blend in standard RGB (sRGB) space and not in linear space. However, if the feature level is set to greater than D3D_FEATURE_LEVEL_9_3, the display device performs the blend in linear space.
/// </remarks>
/// <unmanaged>D3D11_BLEND_DESC</unmanaged>
public partial struct BlendDescription {

/// <summary>
/// Determines whether or not to use alpha-to-coverage as a multisampling technique when setting a pixel to a rendertarget.
/// </summary>
/// <unmanaged>BOOL AlphaToCoverageEnable</unmanaged>
public bool AlphaToCoverageEnable {
    get { return _AlphaToCoverageEnable != 0; }
    set { _AlphaToCoverageEnable = value ? 1 : 0; }
}
internal int _AlphaToCoverageEnable;

/// <summary>
/// Set to TRUE to enable independent blending in simultaneous render targets. If set to FALSE, only the RenderTarget[0] members are used. RenderTarget[1..7] are ignored.
/// </summary>
/// <unmanaged>BOOL IndependentBlendEnable</unmanaged>
public bool IndependentBlendEnable {
    get { return _IndependentBlendEnable != 0; }
    set { _IndependentBlendEnable = value ? 1 : 0; }
}
internal int _IndependentBlendEnable;

/// <summary>
/// An array of render-target-blend descriptions (see <see cref="SharpDX.Direct3D11.RenderTargetBlendDescription"/>); these correspond to the eight rendertargets that can be set to the output-merger stage at one time.
/// </summary>
/// <unmanaged>D3D11_RENDER_TARGET_BLEND_DESC RenderTarget[8]</unmanaged>
public SharpDX.Direct3D11.RenderTargetBlendDescription[] RenderTarget {
    get {
        if (_RenderTarget == null)
            _RenderTarget = new SharpDX.Direct3D11.RenderTargetBlendDescription[8];
        return _RenderTarget;
    }
}
internal SharpDX.Direct3D11.RenderTargetBlendDescription[] _RenderTarget;

// Internal native struct used for marshalling
[StructLayout(LayoutKind.Sequential, Pack = 0)]
internal unsafe partial struct __Native {
    public int _AlphaToCoverageEnable;
    public int _IndependentBlendEnable;
    public SharpDX.Direct3D11.RenderTargetBlendDescription RenderTarget;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget1;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget2;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget3;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget4;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget5;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget6;
    SharpDX.Direct3D11.RenderTargetBlendDescription __RenderTarget7;
}

// Method to free native struct
internal unsafe void __MarshalFree() { }

// Method to marshal from native to managed struct
internal unsafe void __MarshalFrom(ref __Native @ref) {
    this._AlphaToCoverageEnable = @ref._AlphaToCoverageEnable;
    this._IndependentBlendEnable = @ref._IndependentBlendEnable;
    fixed (void* __to = &this.RenderTarget[0])
    fixed (void* __from = &@ref.RenderTarget)
        SharpDX.Utilities.CopyMemory((IntPtr)__to, (IntPtr)__from, 8 * sizeof(SharpDX.Direct3D11.RenderTargetBlendDescription));
}

// Method to marshal from managed struct to native
internal unsafe void __MarshalTo(ref __Native @ref) {
    @ref._AlphaToCoverageEnable = this._AlphaToCoverageEnable;
    @ref._IndependentBlendEnable = this._IndependentBlendEnable;
    fixed (void* __to = &@ref.RenderTarget)
    fixed (void* __from = &this.RenderTarget[0])
        SharpDX.Utilities.CopyMemory((IntPtr)__to, (IntPtr)__from, 8 * sizeof(SharpDX.Direct3D11.RenderTargetBlendDescription));
}
}


  • For Interfaces, the mapping is quite complex, because it is necessary to handle lots of different cases:
    • Optional structures in input
    • Optional parameters
    • Output arrays of interfaces
    • Performing some custom marshalling (for example, with the previous BlendDescription structure)
    • Generating properties for methods that are property-eligible
    • ...etc.
For example, the method using the BlendDescription is like this:
/// <summary> 
/// Create a blend-state object that encapsulates blend state for the output-merger stage.
/// </summary>
/// <remarks>
/// An application can create up to 4096 unique blend-state objects. For each object created, the runtime checks to see if a previous object has the same state. If such a previous object exists, the runtime will return a pointer to the previous instance instead of creating a duplicate object.
/// </remarks>
/// <param name="blendStateDescRef">Pointer to a blend-state description (see <see cref="SharpDX.Direct3D11.BlendDescription"/>).</param>
/// <param name="blendStateRef">Address of a pointer to the blend-state object created (see <see cref="SharpDX.Direct3D11.BlendState"/>).</param>
/// <returns>This method returns E_OUTOFMEMORY if there is insufficient memory to create the blend-state object. See {{Direct3D 11 Return Codes}} for other possible return values.</returns>
/// <unmanaged>HRESULT CreateBlendState([In] const D3D11_BLEND_DESC* pBlendStateDesc,[Out, Optional] ID3D11BlendState** ppBlendState)</unmanaged>
public SharpDX.Result CreateBlendState(ref SharpDX.Direct3D11.BlendDescription blendStateDescRef, out SharpDX.Direct3D11.BlendState blendStateRef) {
    unsafe {
        SharpDX.Direct3D11.BlendDescription.__Native blendStateDescRef_ = new SharpDX.Direct3D11.BlendDescription.__Native();
        blendStateDescRef.__MarshalTo(ref blendStateDescRef_);
        IntPtr blendStateRef_ = IntPtr.Zero;
        SharpDX.Result __result__;
        __result__ = (SharpDX.Result)SharpDX.Interop.CalliInt(_nativePointer, 20 * 4, &blendStateDescRef_, &blendStateRef_);
        blendStateRef = (blendStateRef_ == IntPtr.Zero) ? null : new SharpDX.Direct3D11.BlendState(blendStateRef_);
        return __result__;
    }
}
Making of Ergon 4K PC Intro
You are not going to discover any fantastic trick here, and the intro itself is not an outstanding coding performance, but I always enjoy reading the making-of of other intros, so it's time to take some time to put this one on paper!

What is Ergon? It's a small 4k intro (meaning a 4096 byte executable) that was released at the 2010 Breakpoint demoparty (if you can't run it on your hardware, you can still watch it on YouTube), and which surprisingly managed to finish in 3rd place! I did the coding and design, and also worked on the music with my friend ulrick.

That was a great experience, even if I didn't expect to work on this production at the beginning of the year... but at the end of January, when BP2010 was announced and supposed to be the last one, I was motivated to go there and, why not, release a 4k intro! One month and a half later, the demo was almost ready... wow, 3 weeks before the party, the first time I finished something so far ahead of an event! But yep, I was able to work on it part time during the week (and the night of course)... When I started on it, though, I had no idea where this project would bring me... or even which 3D API I should start from to do this intro!

OpenGL, DirectX 9, 10 or 11?

At FRequency, xt95 mainly works in OpenGL, mostly due to the fact that he is a Linux user. All our previous intros were done using OpenGL, although I did provide some help on some intros and bought OpenGL books a few years ago... I'm not a huge fan of the OpenGL C API, but most importantly, from my short experience with it, I was always able to strip down DirectX code size better than OpenGL code... At that time, I was also working a bit more with the DirectX API... I had even bought an ATI 5770 earlier to be able to play with the D3D11 Compute Shader API... I'm also mostly a Windows user... DirectX has very well integrated documentation in Visual Studio, a good SDK with lots of samples inside, a cleaner API (more true of the recent D3D10/D3D11), some cool tools like PIX to debug shaders... and I also thought that programming with DirectX on Windows might reduce the risk of incompatibilities between NVIDIA and ATI graphics cards (although I found that, at least with D3D9, this is not always true...).

So ok, DirectX was selected... but which version? I started my first implementation with D3D10. I know the code is much more verbose than D3D9 and OpenGL 2.0, but I wanted to practice the somewhat "new" API a bit more, rather than just reading a book about it. I was also interested in putting some text in the demo and tried an integration with the latest Direct2D/DirectWrite API.

Everything went well at the beginning with the D3D10 API. The code was clean, thanks to the thin layer I developed around DirectX to make the coding experience much closer to what I used to have in C# with SlimDX for example. The resulting C++ code was something like this:
// Set VertexBuffer for InputAssembler Stage
device.InputAssembler.SetVertexBuffers(screen.vertexBuffer, sizeof(VertexDataOffline));

// Set TriangleList PrimitiveTopology for InputAssembler Stage

// Set VertexShader for the current Pass
Very pleasant to develop with, but because I wanted to test D2D1, I switched to D3D10.1, which can be configured to run on D3D10 hardware (with the feature level thing)... So I also started to slightly wrap up the Direct2D API and was able to very easily produce some really nice text... but wow... the code was a bit too large for a 4k (though it would be perfect for a 64k).

Then, during this experimentation phase, I tried the D3D11 API with the Compute Shader thing... and found that the code is much more compact than D3D10 if you are performing some kind of... for example, raymarching... I didn't compare code sizes, but I suspect it can compete with its D3D9 counterpart (although there is a downside in D3D11: if you can afford real D3D11 hardware, a compute shader can directly render to the screen buffer... otherwise, using the D3D11 compute shader with feature level 10, you have to copy the result from one resource to another... which might eat into the size benefit...).

I was happy to see that the switch to D3D11 was easy, with some continuity from D3D10 in the API "look & feel"... although I was disappointed to learn that working with both D3D11 and D2D1 was not straightforward, because D2D1 is only compatible with the D3D10.1 API (which you can run with feature levels 9.0 to 10), forcing you to initialize and maintain two devices (one for D3D10.1 and one for D3D11) and to play with DXGI shared resources between the devices... wow, lots of work, lots of code... and of course, out of the question for a 4k...

So I tried... plain old good D3D9... and that was of course much more compact in size than its D3D10 counterpart... So for around two weeks in February, I played with those various APIs while implementing some basic scenes for the intro. I just had a bad surprise when releasing the intro, because lots of people were not able to run it: weird, because I had been able to test it on several NVIDIA cards and at least my ATI 5770... I didn't expect D3D9 to be so sensitive to that, or at least, I expected it to be a bit less sensitive than OpenGL... but I was wrong.

Raymarching optimization

I decided to go for an intro using the raymarching algorithm, which was more likely to deliver "fat" content in a tiny amount of code. Although the raymarching stuff was already a bit "retired", after the fantastic intros released in 2009 (Elevated - not really a raymarching intro but so impressive!, Sult, Rudebox, Muon-Baryon... etc.). But I didn't have enough time to explore a new effect and was not even confident I would find anything interesting at that time... so... ok, raymarching.

So for one week, after building a 1st scene, I spent my time trying to optimize the raymarching algo. There was an instructive thread on pouet about this: "So, what do distance field equations look like? And how do we solve them?". I tried to implement some tricks like...
  1. Generate a grid in the vertex shader (with 4x4 pixels for example), to precompute a raw view of the scene, storing the minimal distance to travel before hitting a surface... then let the pixel shader take those interpolated distances (multiplied by a small reduction factor like .9f) and perform some fine-grained raymarching with fewer iterations
  2. Generate a pre-rendered 3D volume of the scene at a much lower density (like 96x96x96) and use this map to navigate the distance fields, while still performing some "sphere tracing" refinement if needed
  3. I also tried some kind of level of detail on the scene: for example, instead of having a texture lookup (for the "bump mapping") at each step during the raymarching, allow the raymarcher to use a simplified analytical surface of the scene and switch to the more detailed one for the last steps
Well, I have to admit that none of those techniques were really clever in any way... and the result matched this lack of cleverness! None of them provided a significant speed optimization compared to the code-size hit they generated.

So after one week of optimization, well, I just went back to a basic raymarcher algo. The shader was developed under Visual C++, integrated in the project (thanks to NShader syntax highlighting). I wrote a small C# tool to strip the shader comments and remove unnecessary spaces... integrated in the build (pre-build events in VC++); it's really enjoyable to work with this toolchain.
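For the record, the basic sphere-tracing loop boils down to something like this (a Python sketch; the scene function, constants and names are mine, just to illustrate the algorithm, not the intro's actual HLSL):

```python
import math

# Signed distance to a sample scene: a sphere at z=5 with radius 1.
def sphere_distance(p, center=(0.0, 0.0, 5.0), radius=1.0):
    return math.dist(p, center) - radius

def raymarch(origin, direction, scene=sphere_distance, max_steps=64, epsilon=1e-3):
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene(p)
        if d < epsilon:
            return t  # hit: distance along the ray
        t += d        # safe step: the distance field guarantees no surface is closer
    return None       # miss

hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
# hit is close to 4.0 (front of the sphere)
```

The appeal for a 4k is exactly this: the whole renderer is one loop plus a distance function, and all the scene complexity lives in that one function.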

Scenes design

For the scenes, I decided to use the same kind of technique used in the Rudebox 4k intro: relying more on the geometry and lights than on the materials. That's what made the success of Rudebox, and I was motivated to build some complex CSG with boolean operations on basic elements (box, sphere... etc.). The nice thing about this approach is that it avoids using any kind of if/then/else inside the iso-surface for determining the material... just letting the lights properly set in the scene do the work. Yep, indeed, Rudebox is for example a scene with, say, a white material for all the objects. What makes the difference is the position of the lights in the scene, their intensity... etc. Ergon used the same trick.

I spent around two to three weeks building the scenes. I ended up with 4 scenes, each quite cool on its own, with a consistent design among them. One of the scenes used fonts to render a wall of text in raymarching.

Because I'm not sure I will ever be able to reuse those scenes, well, I'm going to post their screenshots here!

The 1st scene I developed, during my D3D9/D3D10/D3D11 API experiments, was a massive tentacle model coming out of a black hole. All the tentacles were moving around a weirdly cut sphere with a central "eye"... I was quite happy about this scene, which had a unique design. From the beginning, I wanted to add some post-processing to enhance the visuals and make them a bit different from other raymarching scenes... So I went with a simple post-processing pass that drew some patterns on the pixels, added a radial blur to produce kind of "ghost rays" coming out of the scene, made the corners darker, and added a small flickering that grows towards the corners. Well, this piece of code alone was already costing as much as a scene, but that was the price of a genuine ambiance, so...

The colors and theming were almost settled from the beginning... I'm a huge fan of warm colors!

The 2nd scene used font rendering coupled with the raymarcher... a kind of flying flag, with the FRequency logo appearing from left to right with a light on it... (I will probably release those effects on pouet just for the record...). That was also a fresh use of raymarching... I hadn't seen anything like it in recent 4k productions, so I was expecting to insert this text in the 4k, as it's not so common... The code to use the D3D font was not too fat... so I was still confident I would be able to use those 2 scenes.

After that, I was looking for some nasty objects... so for the 3rd scene, I randomly played with some weird functions and ended up with a kind of "raptor" creature... I also wanted to use a weird generated texture I had found a few months ago, which was perfect for it.

Finally, I wanted to use the texture to make a kind of lava sea with a moving snake on it... that was the last scene I coded (plus, of course, 2 other scenes that are too ugly to show here! :) ).

At that time, in February, we also started to work on the music, and as I explained in my earlier posts, we used the 4klang synth for the intro. But with all those scenes and a music prototype, the "crinklered" compressed exe was more around 5 KB... even if the shader code was already size-optimized, using some kind of preprocessor templating (like in Rudebox or Receptor). The intro was of course lacking a clear direction, there were no transitions between the scenes... and most importantly, it was not possible to fit all those scenes in 4k while expecting the music to grow a little bit more in the final exe...

The story of the Worm-Lava texture

Last year, around November, while I was playing with several Perlin-like noises, I found an interesting variation using perlin noise and the marble-cosine effect that was able to represent some kind of worms. Quite freaking ugly in some way, but that was a unique texture effect!


This texture was primarily developed in C#, but the code was quite straightforward to port to a texture shader... Yep, it's probably an old trick with D3D9 to use the function D3DXFillTextureTX to directly fill a texture from a shader with a single line of code... Why use this? Because it was the only way to get a noise() function accessible from a shader without having to implement it... As weird as it may sound, the HLSL perlin noise() function is not accessible outside a texture shader. A huge drawback of this method is that the shader is not a real GPU shader, but is instead computed on the CPU... which explains why the ergon intro takes so long to generate the texture at the beginning (with a 1280x720 texture resolution, for example).

So what does this texture shader look like, in order to generate this texture?
// -------------------------------------------------------------------------
// worm noise function
// -------------------------------------------------------------------------
#define ty(x,y) (pow(.5+sin((x)*y*6.2831)/2,2)-.5)
#define t2(x,y) ty(y+2*ty(x+2*noise(float3(cos((x)/3)+x,y,(x)*.1)),.3),.7)
#define tx(x,y,a,d) ((t2(x, y) * (a - x) * (d - y) + t2(x - a, y) * x * (d - y) + t2(x, y - d) * (a - x) * y + t2(x - a, y - d) * x * y) / (a * d))

float4 x( float2 x : position, float2 y : psize) : color {
    float a=0,d=64;
    // Modified FBM function to generate a blob texture
    a += abs(tx(x.x*d,x.y*d,d,d)/d);
    return a*2;
}

The tx macro basically applies tiling to the noise.
The core t2 and ty macros are the ones that generate this "worm-noise". It's in fact a tricky variation of the usual cosine perlin noise: instead of having something like cos(x + noise(x,y)), I have something like special_sin( y + special_sin( x + noise(cos(x/3)+x,y), power1), power2), with the special_sin function being ((1 + sin(x*power*2*PI))/2) ^ 2.

Also, don't be afraid... this formula didn't come out of my head like this... it was clearly the result of lots of permutations of the original function, with lots of run/stop/change_parameters steps! :D

Music and synchronization

It took some time to build the music theme and be satisfied with it... At the beginning, I let ulrick make a first version of the music... but because I had a clear view of the design and direction, I was expecting a very specific progression in the tune and even in the chords used... That was really annoying for ulrick (excuse me, my friend!), as I was very intrusive in the composition process... At some point, I ended up making a two-pattern example of what I wanted in terms of chords and musical ambiance... and ulrick was kind enough to take this sample pattern, and clever enough to add some of the intro's musical feeling to it. He will be able to talk about this better than me, so I'll ask him if he can insert a small explanation here!

ulrick here: « Working with @lx on this prod was a very enjoyable job. I started a music track which @lx did not like very much; it did not reflect the feelings that @lx wanted to convey through Ergon. He thus composed a few patterns using a very emotional musical scale. I entered into the music very easily and added my own stuff. For the anecdote, I added a second scale to the music to allow for a clearer transition between the first and second parts of Ergon. After doing so, we realized that our music actually used the chromatic scale on E »

The synchronization was the last part of the work on the demo. I first used the default synchronization mechanism from the 4klang... but I was lacking some features: for instance, if the demo was running slowly, I needed to know exactly where I was... Using plain 4klang sync, I was missing some events on slow hardware, even preventing the intro from switching between the scenes, because the switching event was missed by the rendering loop!

So I did my own small synchronization based on the regular events of the snare and a reduced view of the sample patterns for these particular events. This is the only part of the intro that was developed in x86 assembler, in order to keep it as small as possible.

The whole code was something like this:
static float const_time = 0.001f;
static int SMOOTHSTEP_FACTOR = 3;

static unsigned char drum_flags[96] = {
// pattern n° time z.z sequence
1,1,1,1, // pattern 0 0 0 0
1,1,1,1, // pattern 1 7,384615385 4 1
0,0,0,0, // pattern 2 14,76923077 8 2
0,0,0,0, // pattern 3 22,15384615 12 3
0,0,0,0, // pattern 4 29,53846154 16 4
0,0,0,0, // pattern 5 36,92307692 20 5
0,0,0,0, // pattern 6 44,30769231 24 6
0,0,0,0, // pattern 7 51,69230769 28 7
0,0,0,1, // pattern 8 59,07692308 32 8
0,0,0,1, // pattern 8 66,46153846 36 9
1,1,1,1, // pattern 9 73,84615385 40 10
1,1,1,1, // pattern 9 81,23076923 44 11
1,1,1,1, // pattern 10 88,61538462 48 12
0,0,0,0, // pattern 11 96 52 13
0,0,0,0, // pattern 2 103,3846154 56 14
0,0,0,0, // pattern 3 110,7692308 60 15
0,0,0,0, // pattern 4 118,1538462 64 16
0,0,0,0, // pattern 5 125,5384615 68 17
0,0,0,0, // pattern 6 132,9230769 72 18
0,0,0,0, // pattern 7 140,3076923 76 19
0,0,0,1, // pattern 8 147,6923077 80 20
1,1,1,1, // pattern 12 155,0769231 84 21
1,1,1,1, // pattern 13 162,4615385 88 22

// Calculate time, synchro step and boom shader variables
__asm {
fild dword ptr [time] // st0 : time
fmul dword ptr [const_time] // st0 = st0 * 0.001f
fstp dword ptr [shaderVar.x] // shaderVar.x = time * 0.001f
mov eax, dword ptr [MMTime.u.sample]
jae not_first_drum
xor eax,eax
idiv dword ptr [SAMPLES_PER_DRUMS] // eax = drumStep , edx = remainder step
mov dword ptr [drum_step], eax
fild dword ptr [drum_step]
fstp dword ptr [shaderVar.z] // shaderVar.z = drumStep

not_end: cmp byte ptr [eax + drum_flags],0
jne no_boom

sub eax,edx
jae boom_ok
xor eax,eax
mov dword ptr [shaderVar.y],eax
fild dword ptr [shaderVar.y]
fidiv dword ptr [SAMPLES_PER_DROP_DRUMS] // st0 : boom
fild dword ptr [SMOOTHSTEP_FACTOR] // st0: 3, st1-4 = boom
fsub st(0),st(1) // st0 : 3 - boom , st1-3 = boom
fsub st(0),st(1) // st0 : 3 - boom*2, st1-2 = boom
fmul st(0),st(1) // st0 : boom * (3-boom*2), st1 = boom
fmulp st(1),st(0)
fstp dword ptr [shaderVar.y]

That was smaller than what I was able to do with pure 4klang sync... with the drawback that the sync was probably too simplistic... but I couldn't afford more code for the sync... so...
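The FPU sequence at the end of the assembler block is just computing the classic smoothstep polynomial, boom * (3 - 2*boom) * boom, to shape the "boom" shader variable. In plain Python terms (function and parameter names are mine):

```python
# Smoothstep envelope for the "boom" shader variable, a transcription of the
# x87 sequence fsub/fsub/fmul/fmulp above: t * t * (3 - 2 * t).
def boom_envelope(samples_since_drum, samples_per_drop):
    t = samples_since_drum / samples_per_drop
    t = max(0.0, min(1.0, t))   # clamp to [0, 1]
    return t * t * (3 - 2 * t)  # smoothstep: 0 at t=0, 1 at t=1, zero slope at both ends
```

The zero slope at both ends is the point: the boom fades in and out without any visible discontinuity, for the cost of three multiplications and two subtractions.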

Final mixing

Once the music was almost finished, I spent a couple of days working on the transitions, sync and camera movements. Because it was not possible to fit the 4 scenes, I had to mix scene 3 (the raptor) and 4 (the snake and the lava sea), and found a way to make a transition through a "central brain". Ulrick wanted to put a different music style on the transition; I was not confident about it... until I put the transition in action, letting the brain collapse while the space under it was digging all around... and the music fit very well! Cool!

I also used a single big shader for the whole intro, with some if (time < x) then scene_1 else scene_2... etc. I didn't expect to do this, because this kind of branch processing hurts performance in the pixel shader... but I was really running out of space here, and the only solution was in fact to use a single shader with some repetitive code. Here is an excerpt from the shader code: you can see how scene and camera management was done, as well as the lights. This part compressed quite well due to its repetitive pattern.
// -------------------------------------------------------------------------
// t3 : Helper function to rotate a vector. Usage :
// t3(mypoint.xz, .7); <= rotate mypoint around Y axis with .7 radians
// -------------------------------------------------------------------------
float2 t3(inout float2 x,float y){
    return x=x*cos(y)+sin(y)*float2(-x.y,x.x);
}

// -------------------------------------------------------------------------
// v : main raymarching function
// -------------------------------------------------------------------------
float4 v(float2 x:texcoord):color{
float a=1,b=0,c=0,d=0,e=0,f=0,i;
float3 n,o,p,q,r,s,t=0,y;
int w;
r=normalize(float3(x.x*1.25,-x.y,1)); // ray
x = float2(.001,0); // epsilon factor

// Scene management
if (z.z<39) {
    w = (z.z<10)?0:(z.z>26)?3+int(fmod(z.z,5)):int(fmod(z.z,3));

    if (w==0) { p=float3(12,5+30*smoothstep(16,0,z.x),0);t3(r.yz,1.1*smoothstep(16,0,z.x));t3(r.xz,1.54); }
    if (w==1) { p=float3(-13,4,-8);t3(r.yz,.2);t3(r.xz,-.5);t3(r.xy,sin(z.x/3)/3); }
    if (w==2) { p=float3(0,8.5,-5);t3(r.yz,.2);t3(r.xy,sin(z.x/3)/5); }
    if (w==3) {
        t3(r.yz, sin(z.x/5)*.6);
        t3(r.xz, 1.54+z.x/5);
        t3(r.xy, cos(z.x/10)/3);
    }
    if (w==4) {
        t3(r.yz, sin(z.x/5)/5);
        t3(r.xz, 1.54+z.x/3);
        t3(r.xy, sin(z.x/10)/3);
    }
    if (w>4) {
        t3(r.yz, 1.54*sin(z.x/5));
        t3(r.xz, .7+z.x/2);
        t3(r.xy, sin(z.x/10)/3);
    }
} else if (z.z<52) {
    t3(r.yz, .9);
    t3(r.xz, 1.54+z.x/4);
} else if (z.z<81) {
    w = int(fmod(z.z,3));
    if (w==0) {
        t3(r.yz, sin(z.x/5)/5);
        t3(r.xz, 1.54+z.x/3);
        t3(r.xy, sin(z.x/10)/3);
    }
    if (w==1) {
        t3(r.yz, 1.1);
        t3(r.xz, z.x/4);
    }
    if (w==2) {
        t3(r.yz, sin(z.x/5)/2);
        t3(r.xz, 1.54+z.x/5);
        t3(r.xy, cos(z.x/10)/3);
    }
} else {

// Boom effect on camera

// Lights
static float4 l[6] = {{.7,.2,0,2},{.7,0,0,3},{.02,.05,.2,7},

Compression statistics

So to summarize, the total exe size is 4070 bytes, composed of:
  • Synth code + music data is taking around 35% of the total exe size = 1461 bytes
  • Shader code is taking 36% = 1467 bytes
  • Main code + non shader data is 14% = 549 bytes
  • PE + crinkler decoder + crinkler import is 15% = 593 bytes

The intro was finished around 13 March 2010, well ahead of BP2010. So that was damn cool... I spent the rest of my time until BP2010 trying to develop a procedural 4k gfx, using D3D11 compute shaders, raymarching and a global illumination algorithm... but the results (algo finished during the party) disappointed me... And when I saw the fantastic Burj Babil by Psycho, he was right about using a plain raymarcher without any complicated true light management... a good "basic" raymarching algo, with some tone-mapping fine-tuning, was much more relevant here!

Anyway, my GI experiment on the compute shader will probably deserve an article here.

I really enjoyed making this demo and seeing that ergon was able to make it into the top 3... after seeing BP2009, I was not expecting the intro to be in the top 3 at all!... although I know that the competition this year was much easier than the previous BP!

Anyway, it was nice to work with my friend ulrick... and to contribute to the demoscene with this prod. I hope I will be able to keep on working on demos like this... I still have lots of things to learn, and that's cool!
          Democoding, tools coding and coding scattering        
Not many posts here for a while... So I'm going to recap some of the coding work I have done so far... you will notice that it's going in lots of directions, depending on opportunities and ideas, sometimes not related to democoding at all... not really ideal when you want to release something! ;)

So, here are some directions I have been working so far...

C# and XNA

I tried to work more with C# and XNA... looking for an opportunity to code a demo in C#... I even started a post about it a few months ago, but left it in a draft state. XNA is really great, but I had some bad experiences with it... I was able to use it without requiring a full install, but while playing with model loading, I hit a weird bug called the black model bug. Anyway, I might come back to C# for DirectX stuff... SlimDX, for example, is really helpful for that.

A 4k/64k softsynth

I have coded a synth dedicated to 4k/64k coding. Right now, though, I only have the VST and GUI fully working under Renoise... but not yet the asm 4k player! ;)

The main idea was to build an FM8/DX7-like synth, with exactly the same output quality (excluding some fancy stuff like the arpeggiator...). The synth was developed in C# using vstnet, but under this language it should be considered more of a prototype... because the asm code generated by the JIT is not really good when it comes to floating-point calculation... anyway, it was really good to develop on this platform, being able to prototype the whole thing in a few days (and of course, many more days to add rich GUI interaction!).

I still have to add a sound-library file manager and the importer for DX7 patches... Yes, you read that right... my main concern is to provide as many ready-to-use patches as possible for ulrick (our musician at FRequency)... Decoding the DX7 patch format is well documented around the net... but the more complex part was to make it decode like FM8 does... and that was tricky... Right now, all the transform functions are in an Excel spreadsheet, but I have to code them in C# now!

You may wonder why I developed the synth in C# if the main target is to code the player in x86 asm? Well, for practical reasons: I needed to quickly experiment with the versatility of the sounds of this synth, and I'm much more familiar with .NET WinForms for easily building a complex GUI. Still, I have done the whole synth with the 4k limitation in mind... especially regarding data representation and the complexity of the player routine.

For example, for the 4k mode of this synth, waveforms are strictly restricted to only one: sin! No noise, no sawtooth, no square... what? A synth without those waveforms?... but yeah... When I looked back at the DX7 synth implementation, I realized that it uses only a pure "sin"... but with the complex FM routing mechanism plus the feedback on the operators, the DX7 is able to produce a large variety of sounds, ranging from strings, bells and bass... to drumkits, and so on...
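To illustrate the point, here is a tiny two-operator FM sketch in Python (illustrative only; the parameter values are made up and this is not the synth's code): a pure sine modulator, with self-feedback, phase-modulates a pure sine carrier, which is enough to produce a rich, non-sinusoidal spectrum:

```python
import math

def fm_note(freq, mod_ratio, mod_index, feedback, n, sr=44100):
    """Two-operator FM: a sine modulator (with self-feedback)
    bends the phase of a sine carrier."""
    out, fb = [], 0.0
    for i in range(n):
        t = i / sr
        # modulator with feedback: its own last output bends its phase
        mod = math.sin(2 * math.pi * freq * mod_ratio * t + feedback * fb)
        fb = mod
        # carrier phase-modulated by the modulator output
        out.append(math.sin(2 * math.pi * freq * t + mod_index * mod))
    return out

samples = fm_note(440.0, mod_ratio=2.0, mod_index=3.0, feedback=1.5, n=1024)
```

Sweeping mod_ratio, mod_index and feedback morphs the timbre from a plain sine to bell-like or brassy sounds, which is why the 4k mode can get away with a single waveform.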

I also did a couple of effects, mainly a versatile variable delay line to implement chorus/flanger/reverb.
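For reference, the core of such a variable delay line can be sketched like this (a minimal Python illustration, not the actual effect code): a circular buffer with an interpolated, modulatable read position:

```python
class VariableDelay:
    """Circular-buffer delay line with linear interpolation,
    so the delay time can be modulated smoothly (chorus/flanger)."""
    def __init__(self, max_samples):
        self.buf = [0.0] * max_samples
        self.pos = 0

    def process(self, x, delay):  # delay in (fractional) samples
        n = len(self.buf)
        self.buf[self.pos] = x
        read = (self.pos - delay) % n
        i, frac = int(read), read - int(read)
        # linear interpolation between the two nearest samples
        y = self.buf[i] * (1 - frac) + self.buf[(i + 1) % n] * frac
        self.pos = (self.pos + 1) % n
        return y

d = VariableDelay(1000)
# feed an impulse; it comes back 10 samples later
out = [d.process(float(i == 0), 10.0) for i in range(20)]
```

Modulating the delay argument with a slow LFO gives chorus/flanger behaviour; summing several such taps with feedback approximates a simple reverb.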

So basically, I should end up with a synth with two modes:
- 4k mode: only 6 oscillators per instrument, sin oscillators only, simple ADSR envelopes, full FM8-like routing for operators, fixed key scaling/velocity scaling/envelope scaling. Effects per instrument/global with a minimal delay line + optional filters. And last but not least, polyphony: that's probably the thing I miss the most in 4k synths nowadays...
- 64k mode: up to 8 oscillators per instrument, all FM8 oscillators + filters + waveshaping + ring-modulation operators, 64-step FM8-like envelopes, dynamic key scaling/velocity scaling/envelope scaling. More effects, with better quality, and 2 effect lines (parallel + serial) per instrument. Additional effects channels to route several instruments into the same effects chain. Modulation matrix.

The 4k mode is in fact a restriction of the 64k mode, applied mostly at the GUI level. I'm currently targeting only the 4k mode, while designing the synth to be ready to support the 64k mode features.

What's next? Well, finish the C# part (file manager and DX7 import) and start the x86 asm player... I just hope to be under 700 compressed bytes for the 4k player (while the 64k mode will be written in C++, with an easier limitation of around 5 KB of compressed code)... but hey, until it's coded... it's pure speculation!... And as you can see, the journey is far from finished! ;)

Context modeling Compression update

During this summer, I came back to the compression experiment I did last year... The current status is pending... The compressor is quite good, sometimes better than Crinkler for 4k... but the prototype of the decompressor (not working, not tested...) is taking more than 100 bytes more than Crinkler's... So in the end, I know that I would be off by 30 to 100 bytes compared to Crinkler... and this is not motivating me to finish the decompressor and get it really running.

The basic idea was to take the standard context-modeling approach from Matt Mahoney (also known as PAQ compression; Matt did a fantastic job with his research, open source all the way), using a dynamic neural network with an order of 8 (8-byte context history), with the same mask-selection approach as Crinkler + some new context filtering at the bit level... In the end, the decompressor uses the FPU to decode the whole thing... as it needs ln2() and pow2() functions... So during the summer, I thought of using another logistic activation function to get rid of the FPU: the standard sigmoid used in the neural network with base 2 is 1/(1+2^-x), so I found something similar with y = (x / (1 + |x|) + 1) / 2 from David Elliott (some references here). I didn't have a computer at the time to test it, so I spent a few days doing some math optimization on it, including calculating the logit function (the inverse of this logistic function).
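The idea can be sketched numerically (Python, for illustration only): both functions are S-shaped maps onto (0,1), the Elliott one needs only add/mul/div, and its inverse (the logit) is closed-form too, but the tails behave quite differently:

```python
def sigmoid2(x):
    """Base-2 logistic used in the PAQ-style mixer: 1 / (1 + 2^-x)."""
    return 1.0 / (1.0 + 2.0 ** -x)

def elliott(x):
    """David Elliott's rational squashing function: (x / (1 + |x|) + 1) / 2.
    Same S shape, but no transcendental functions needed."""
    return (x / (1.0 + abs(x)) + 1.0) / 2.0

def elliott_inv(p):
    """Closed-form inverse ('logit') of the Elliott function, 0 < p < 1."""
    u = 2.0 * p - 1.0            # map back to (-1, 1)
    return u / (1.0 - abs(u))

mid = (sigmoid2(0.0), elliott(0.0))        # both 0.5 at x = 0
tails = (sigmoid2(10.0), elliott(10.0))    # ~0.999 vs ~0.955: much fatter tail
```

The fatter tails mean the Elliott-based mixer is far less confident on strong contexts, which is consistent with the compression-ratio loss mentioned below.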

I came back home very excited to test this method... but I was really disappointed... the function had a very bad impact on the compression ratio, by a factor of 20%; in the end, completely useless!

If by next year I'm not able to release anything from this... I will make all this work open source, at least for educational purposes... someone will certainly be cleverer than me on this and tweak the code size down!

A SlimDX-like DirectX wrapper in C++

Recall that for the ergon intro, I worked with a very thin layer around DirectX to wrap enums/interfaces/structures/functions. I did that for D3D10, a bit of D3D11, and a bit of D3D9 (which is the one I used for ergon). The goal was to achieve a DirectX C#-like interface in C++. While the code was written almost entirely by hand, I was wondering if I could generate it directly from the DirectX header files...

So for the last few days, I have been working on this a bit... I'm using boost::wave as the preprocessor library... and I have to admit that the C++ guys from Boost lost their minds with templates... It's amazing how they made something simple so complex with templates... I wanted to use this in a C++/CLI managed .NET extension to ease my development in C#, but I ended up with a template error at the link stage... an incredible error with a line full of concatenated templates, even freezing Visual Studio when I wanted to see the errors in the error list!

Templates are really nice when they are not used too intensively... but when everything is templatized in your code, it becomes very hard to use a library fluently, and it's sometimes impossible to understand a template error when it runs to more than 100 lines full of cascading template types!

Anyway, I was able to plug boost::wave into a native DLL and call it from a C# library... the next step is to see how much I can extract from the DirectX header files in the form of an IDL (Interface Definition Language). If I cannot get something relevant in the next week, I might postpone this task until I don't have anything more important to do! The good thing is that, for the D3D11 headers for example, you can see that those files were auto-generated from a mysterious d3d11.idl file... used internally at Microsoft (although it would have been easier to get that file directly!)... so it means that the whole header is quite easy to parse, as the syntax is quite systematic.

Ok, this is probably not linked to intros... or probably only to 64k... and I'm not sure I will be able to finish it (much like rmasm)... And this kind of work is keeping me away from working directly with DirectX, experimenting with rendering techniques and so on... Well, I have to admit that for the past few years I have been more attracted to building tools to enhance coding productivity (not necessarily only mine)... I don't like doing too many things manually... so every time there is an opportunity to automate a process, I can't refrain from making it automatic! :D

AsmHighlighter and NShader next update

Following my bad appetite for tools, I need to make some updates to AsmHighlighter and NShader: add some missing keywords, patch a bug, support the new VS2010 version... whatever... When you release this kind of open source project, well, you have to maintain it, even if you don't use it much... because other people are using it and asking for improvements... that's the other side of the picture...

So because I have to maintain those two projects, and they in fact logically share more than 95% of the same code, I have decided to merge them into a single one... which will be available soon under CodePlex as well. That will be easier to maintain, ending with only one project to update.

The main features people are asking for are the ability to add keywords easily and to map file extensions to the syntax-highlighting system... So I'm going to generalize the design of the two projects to make them more configurable... hopefully, this will cover the main feature requests...

An application for Windows Phone 7... meh?

Yep... I have to admit that I'm really excited by the upcoming Windows Phone 7 Metro interface... I'm quite fed up with my iPhone's look and feel... and because the development environment is so easy with C#, I have decided to code an application for it. I'm starting with a chromatic tuner for guitar/piano/violin... etc., and it's working quite well, even if I have only been able to test it under the emulator. While developing this application, I have learned some cool things about pitch-detection algorithms and so on...
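For the curious, the classic starting point for a chromatic tuner is autocorrelation pitch detection; here is a naive Python sketch (my own illustration, not the application's code):

```python
import math

def detect_pitch(samples, sr):
    """Naive autocorrelation pitch detector: find the lag where the
    signal best matches a shifted copy of itself."""
    n = len(samples)
    best_lag, best_score = 1, float("-inf")
    # lags below sr/1000 correspond to pitches above 1 kHz; skip them
    for lag in range(sr // 1000, n // 2):
        score = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if score > best_score:
            best_score, best_lag = score, lag
    return sr / best_lag

sr = 8000
tone = [math.sin(2 * math.pi * 200.0 * i / sr) for i in range(800)]
pitch = detect_pitch(tone, sr)  # close to 200 Hz
```

Real tuners refine this with normalization (e.g. YIN-style difference functions) and parabolic interpolation around the best lag, but the principle is the same.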

I hope to finish the application around September, to be able to test it on real hardware when WP7 is officially launched... and before putting the application on the Windows Marketplace.

If this works well, I will look at developing other applications, like porting the softsynth I did in C# to this platform... We will see... and definitely, this last part is completely unrelated to democoding!

What's next?

Well, I have to prioritize my work for the next months:
  1. Merge AsmHighlighter and NShader into a single project.
  2. Play a bit for one week with the DirectX headers to see if I can extract some IDL-like information
  3. Finish the 4k mode of the softsynth... and develop the x86 asm player
  4. Finish the WP7 application
I also still have an article to write about ergon's making-of; there's not much to say about it, but it could be interesting to put those things down on paper...

I also need to work on some new DirectX effects... I have played a bit with hardware instancing and compute shaders (with a raymarcher with global illumination for a 4k procedural compo that didn't make it to BP2010, because the results were not impressive enough, and too slow to compute...)... I would really like to explore SSAO with plain polygons... but I haven't taken the time for that... so yep, practicing more graphics coding should be at the top of my list... instead of all those time-consuming and, sometimes useful, tools!
          By: ildar        
JavaScript is a special programming language with its own features. I don't see urgent reasons to embed features from other languages. But if you really want or need to implement privates in your classes, you should think hard: do you really need this feature? The suggested examples with inner functions are weird and ugly because you create a public method to access the privates; you just cover real privates with a public method. I have no objection if you really need this. I can suggest simpler code:

// Example 1
function A() {
    var privates = {};
    this.getPrivates = function() { return privates; };
}
A.prototype.setX = function(x) { return this.getPrivates().x = x; };
A.prototype.getX = function() { return this.getPrivates().x; };

// Example 2
function A() {
    this.getPrivates = function() {
        return arguments.callee.privates = arguments.callee.privates || {};
    };
}
A.prototype.setX = function(x) { return this.getPrivates().x = x; };
A.prototype.getX = function() { return this.getPrivates().x; };

Of course, we declare an inner method in both examples, but this code is simpler and more laconic.
          By: Duarte Cunha Leão        
When I found your article I thought: "I'm going to give up; someone has already tried it and failed!" I had also arrived at the pattern you present, and also found it was flawed... But I am so stubborn... and continued trying new variants of the pattern. I finally found a way to achieve private instance state (and with normal prototype methods). Check it out:
          By: Alan Pearce        
The technique below creates the guessSecret method on the Cartman objects' prototype and keeps the "secret" variable private while creating it only once.

function Cartman() {
    // This is called only for the first instance
    var secret = 'Tea party';
    Cartman = function() {
        // This becomes the new constructor
    };
    Cartman.prototype.guessSecret = function(guess) {
        return guess == secret;
    };
    return new Cartman();
}
          Harley Davidson brand motorcycles launched in India.        

  • The world's most iconic motorcycle brand, Harley-Davidson, has finally entered Indian territory with its elite range of motorcycles. The American bike maker launched its products in India at the Auto Expo 2010, New Delhi. The company introduced 12 bikes, from its street-based Sportster series to its premium touring model, the Ultra Classic Electra Glide. Bookings for Harley-Davidson motorcycles will start from April 2010.
  • The bikes will be imported in CBU condition and sold through select Harley-Davidson outlets in New Delhi, Bangalore, Chennai, Hyderabad and Chandigarh. Deliveries will commence from June 2010, once the dealer networks start their operations.
  • The company is also planning to provide good financing options to buyers, as the aim is to establish the Harley-Davidson brand in India.
  • The bikes will be competitively priced, starting from Rs. 6.95 lakh ex-showroom, New Delhi.

          Weekly Recap: BMW rolls out ambitious plug-in hybrid electric plan        
Filed under: Hybrid, Performance, Truck, Technology, Aston Martin, BMW, Chevrolet, Toyota, Electric, Luxury, Emissions

"We believe that for the United States, this is going to be very important." - Julian Arguelles

Let there be no doubt: BMW is serious about electric vehicles. The German automaker said this week it will make plug-in hybrid versions of all of its core models, an aggressive move that demonstrates its commitment to electric propulsion systems. BMW did not specify which vehicles will get the plug-in systems or provide a timeline for when they will arrive. But the announcement is clearly more than bluster, and the company revealed a 3 Series plug-in prototype this week at an event in France. BMW said the 3 Series uses a version of its 2.0-liter turbocharged four-cylinder engine (240 horsepower, 300 pound-feet of torque) with an electric motor sandwiched between the engine and transmission in place of the torque converter. It has an all-electric range of 22 miles. A plug-in X5 with the same powertrain was also displayed alongside the 3 Series, though the X5 has been on the auto-show circuit for more than a year, including a recent stop in Los Angeles. This article originally appeared on Autoblog on Sat, 06 Dec 2014.
          Week 8        
This week was dedicated to our first prototype for Level 1. The playtest we held on Friday gave the team much confidence. Progress: the art style was locked in early this week. Since the game aims to create expressiveness for different kinds of players, we decided to go with the simple, neat style that would […]
          Week 7        
This week was productive. Our team completed the first prototype and tested it within our team. It was a good start. Progress: after a short discussion on Tuesday about our deliverables by the end of this week, we agreed that the first prototype should be finished by Thursday. Our programmers, Xiao, Albert and Kiran, were working on […]
          Conversion Case Stories - Sink and Drink Table        
Conversion Case Stories:  Sink and Drink Table

The founders of Liquid Games, LLC decided to take a popular game called beer pong and develop a better product than was available. Their concept was to take a traditionally boring item – a standard folding banquet table with little or no graphics on it – to the next level by introducing tracking neon lights, illuminated pockets to light up the cups, and other physical features that also supported another popular game known as Flip Cup. The idea was not only to create a more enjoyable gaming table but also to make a table more suitable for bars and nightclubs. Their original table was a wood-composite construction with internal LED lights that required hands-on carpentry skill, built in their garage woodshop. The new table was an immediate hit and became a popular traveling fixture at local bars and parties, so they decided to market it commercially. Upon publishing their new website they were immediately faced with a problem every retailer loves to have – too many sales – and they knew at once that they would not be able to keep up with demand by constructing each table out of wood. They needed to find a fabrication process, and a company, capable of converting the wood-constructed table into a medium-volume manufactured product that could be produced at a cost . . . . Plastic was the obvious choice, and vacuum forming was the most advantageous process due to its low tooling and production costs.

Our first meeting involved literally setting up the table in our conference room and asking, "How can we build this in plastic?" A comprehensive prototype program was laid out, and we immediately began designing the new-generation "Sink and Drink" table. Within a week several designs were evaluated, and a final concept design was agreed upon. Our engineering team generated a solid-model CAD assembly consisting of three vacuum-formed ABS and clear PETG components and a CNC-cut PVC foam-board panel. Upon approval of photorealistic concept renderings generated from the solid-model CAD data, "soft" tooling (wood molds) was produced in Techniform's in-house tooling shop, and first-article vacuum-formed samples were made. Upon approval of the prototype "Sink and Drink" tables, production temperature-controlled aluminum tooling was developed, and Liquid Games was able to re-launch their website showcasing the new vacuum-formed "Sink and Drink" tables within a few months of initiating the development program. The product won "Best Invention of the Americas" at the INPEX New Production and Invention Expo in Pittsburgh, PA, and was also featured on the Tonight Show with Jay Leno.


Learn more at our website or call us today at 1-800-691-2816 to discuss your next project or get a copy of our capabilities brochure!
© 2011 Techniform Industries Michael Robinette

          Thermoforming Conversion Case Stories - Technically Speaking!        
Of the many advantages that plastic thermoforming offers, perhaps the most significant is the ability to convert products previously produced using other materials into a thermoplastic component more suitable for the application. The reasons for converting products to plastics range from production economics to performance and aesthetic characteristics.

Typical conversion development processes are as follows:
   - Sheet Metal to Plastic
   - Fiberglass to Plastic
   - Wood Fabrication to Plastic
   - Plastic Sheet Fabrication to Plastic

Because thermoform tooling is very inexpensive relative to other plastic fabrication processes like injection molding or blow molding, and molds can often be developed without CAD data or technical drawings, the process of converting to plastics is both economical and fast. In many cases, prototypes can be molded from "soft" tooling (wood or epoxy) that can be produced at very low cost with a very fast turnaround – often within a few days. This keeps the risk factor low and the critical time to market short.


          Blog Post: Sid Meier Charts A Flight Plan For Ace Patrol        

Last week, we posted our interview with the lead designers of Haunted Hollow, but Firaxis has another mobile game up its sleeve for later this week. If you’re trying to decide whether to invest time in the upcoming World War I flight/strategy game, Sid Meier had the following to share about the title, on which he served as lead designer. [Excerpt]

Thanks for taking some time to answer questions about the new project! 

My pleasure!

What is your title and role on this project? 

I’m Sid Meier, and I’m the lead designer and one of the programmers on Ace Patrol. I’m also Creative Director at Firaxis Games.

What’s the top-level game concept for Ace Patrol? 

Ace Patrol is a game that puts you in command of some cool World War I biplanes in turn-based combat against enemy aircraft and aces. You’ll need to choose your maneuvers carefully each turn in order to protect your squadron and pilots, get the drop on the enemy, and succeed in your mission. Along the way you’ll be upgrading your aircraft and trying new aircraft models, as you experience a variety of missions along the Western Front. 

What makes the game a good fit for iOS? 

Turn-based combat works very well on iOS. Also, the missions are a good, bite-sized length for mobile gaming, lasting maybe five to fifteen minutes. Bigger missions might take a bit longer. You can play through a whole campaign in an hour or two, or you can play intermittently over a couple of days. The concept works well technically on these devices, and we've really tried to make the touch controls feel intuitive and easy. Touch controls feel like a natural fit for turn-based games, because you're moving one thing at a time as you play.

Many gamers are familiar with Firaxis’ work on XCOM and Civilization. What features does Ace Patrol share with those games, and in what ways does it move off in other directions? 

Primarily, Ace Patrol is a game that is aimed at a similar audience, strategy gamers who like to think ahead, make a plan, try out lots of different approaches to the game, and then discuss and compare what they do with other players. You get to use cool hardware in our games, whether it’s a biplane in Ace Patrol or alien-technology-based guns in XCOM. There are a lot of cool pieces that you get, but you get to put them together in your own strategy, and then watch yourself improve as you play more and more. 

World War I-era fighter planes seems like a fascinating game setting, but an unusual one for turn-based strategy. Without the benefit of cover, buildings, and ground landmarks, how does Ace Patrol remain strategically interesting? 

That’s a very good question! We actually did find cover in clouds, and we found buildings in the various targets around the world, such as anti-aircraft gun emplacements, which make certain areas dangerous or safe to fly in. Part of the strategy might be luring the enemy into range of your anti-aircraft guns, so the map actually has a really good sense of safety, danger, and cover – all the basic elements that cover provides you in a ground combat game. The map is an important part of the game. There’s also positioning in whether you’re lower or higher than your target, and that has implications for attacking and maneuvering, and so the world is pretty interesting when you’re playing in it, and is definitely part of the tactics and strategy. The interaction of the planes with each other, the landmarks, and the ground is a constantly changing, fascinating strategic problem. 



How does the game progress over time in terms of upgrades, leveling up, or improvements? 

You’ll get lots of upgrades. Your aircraft get better over time, and you’ll have the opportunity to progress to better aircraft as well as upgrade the equipment that your pilots use, such as gunsights, armor, and better guns. Your pilots definitely improve over time as they learn maneuvers, and you’ll build your tactics based on the arsenal of maneuvers each pilots knows. One of the fundamental concepts is almost this collectible-card-game-style building up of pilots, moves, and equipment in determining your approach to any given mission. A certain pilot might be good at looping maneuvers, or the famous Immelman turn, and another might know slips or skids. The game also progresses in that you can try different campaigns, like the Germans or the Americans.

Does Ace Patrol have a structured story you’re playing through, or is it more about individual, disconnected scenarios/battles? If there is a story, who are you playing, and what’s the thrust of the plotline? 

It’s a campaign, and you create the story, but the individual battles and missions are connected in the sense that the improvements you earn carry over from one to another. Damaged planes might have to sit out a mission while being repaired, and captured pilots might not come back for a while. It’s very much a connected story, but it’s one you write yourself as you decide which missions to take and which pilots to bring with you. 

What can you tell me about the different types of planes or pilots you control? 

All of the aircraft are based on historical prototypes in terms of their capabilities, speed, firepower, and maneuverability. All the really cool planes are there, and there are some very famous and unique planes from that era, like the Fokker Dr.1 or the Sopwith Camel. Designers in those days didn’t have computer aided design or wind tunnels, and so there was a lot of experimentation which led to a variety of aircraft types appearing in the skies. You get to fly with and against all these cool and unique planes. That’s very appealing from a gaming perspective because of the variety that offers the player. 

Pilots are also individuals who improve over time with new maneuvers. They keep a journal or log of their experiences during the battle, so you get some sense of how they’re experiencing the campaign. You do develop a connection with them, in a similar way you get attached to your soldiers in XCOM, in terms of their experiences and abilities and how you rely on them in creating a tactical plan.

What’s the visual style the team is shooting for in the game? Does it lean towards realism, or something more exaggerated or cartoony? 

It’s a middle ground which tries to capture some of the glamor and innocence, the bright color and design of the World War I plane. These are really iconic, like the Red Baron’s aircraft. There’s a more realistic looking world to fly over, with fields and trenches. It’s a blend of realism and some dramatic and colorful aircraft.

Any plans to include multiplayer? If so, how will multiplayer work? 

We have asynchronous multiplayer where each player creates their own squadron, planes, pilots, and abilities, and then goes head-to-head with another squadron controlled by another player. There’s also a “hot pad” mode where two players are on the same device, and these are quicker, instant-action style games where you’re playing more informally for five to fifteen minutes. 

It sounds like the plan is for Ace Patrol to be free-to-play. In what ways do you hope to monetize the gaming experience? 

Our free-to-play model is more of a "try the demo; if you like it, buy the game" model. You get a number of missions for free, which give you a good feel for the game. If you want to go deeper, then we have campaigns that are offered within the game. You can also unlock aces, or improvements to help keep your squadron operating efficiently, such as ambulances to recover downed pilots.

          Inspiring Young Minds to be Innovators and Pursue their Dreams        
Photo of Joe Matal at Camp Invention in Hyattsville, MD.

In connection with American Dream Week (July 31—August 4, 2017), the U.S. Department of Commerce is proud to highlight the important role Commerce agencies play in creating jobs and economic opportunities in American communities across the nation.   

Blog by Joe Matal, Performing the Functions and Duties of the Under Secretary of Commerce for Intellectual Property and Director of the USPTO

At Camp Invention, almost two million students have explored their own innate creativity, inventiveness and entrepreneurial spirit in a week-long day camp program that has run annually since 1990. Currently held at more than 1,400 sites in 50 states for students in kindergarten through 6th grade, the program teaches them how to think big, be innovators and pursue their dreams.

Camp Invention is a partnership between the United States Patent and Trademark Office (USPTO) and the National Inventors Hall of Fame. The program includes a robust STEM (science, technology, engineering and mathematics) curriculum while also providing insights into the role of patents and trademarks in innovation. Children develop questions, collect data, draw conclusions and apply new knowledge while tackling hands-on challenges.

Recently, I had the chance to visit Camp Invention at Hyattsville Elementary in Maryland. I was impressed by how the students came up with new product ideas and built original prototypes using real tools and components found in everyday devices. But beyond that, they had also thought through how they were going to brand and market an item and how they would protect their innovation by applying for a patent and trademark. I was inspired by their enthusiasm and inventive thinking.

Camp Invention is unique because it provides an exciting environment with no wrong answers, a chance to brainstorm with peers and an opportunity to build confidence in the natural ability to dream and create. On a given day, students might learn about such things as terraforming exoplanets, building an air cannon, exploring circuits and electronics or presenting their new invention to mock investors.  

Each year, one Camp Invention student is selected through the “Mighty Minds” contest for an all-expense paid trip to attend the National Inventors Hall of Fame Induction Ceremony in Washington, DC.  This year, the winner was 9-year-old Mya Sewell of Grayson, GA, who has attended Camp Invention for several years. She says she wants to be a scientist or inventor because, “it gives me the freedom to experiment with things without anybody telling me what to do.” Learn more about her experience interacting with prominent inventors at next year’s induction ceremony on May 4, 2018.

In addition to Camp Invention, the USPTO also works with the National Inventors Hall of Fame on the Collegiate Inventors Competition, a program designed to allow undergraduate and graduate students to showcase their emerging ideas and inventions that will shape our future. The finalists are judged by a team of inductees from the National Inventors Hall of Fame and USPTO subject-matter experts, and then honored at the USPTO. Winners enjoy over $100,000 in cash prizes and an all-expense paid trip to Washington, DC.

Through the USPTO’s partnerships with youth programs, such as Camp Invention and Collegiate Inventors, we hope to inspire future innovators and encourage creativity and problem-solving skills to enable the next generation to achieve the American Dream.

          3D printing – A New Industry Made in America        
Image of Additive Manufacturing Partnership meeting held at the United States Patent and Trademark Office (USPTO).

Increasingly, we’re seeing the products of additive manufacturing – better known as 3D printing – all around us: in retail stores, in classrooms, and even in medical technologies.  

The U.S. Patent and Trademark Office (USPTO) received over 8,000 patent applications last year alone in the field of additive material technologies. These represent a range of products – from household items to prosthetics – that are being manufactured with 3D printing and are having a positive impact on people’s lives and the economy.

One of the founding minds in 3D printing is National Inventors Hall of Fame inductee Charles Hull. Troubled by how long it could take to create a prototype of a new device or tool, he created stereolithography in the 1980s, the first commercial rapid prototyping technology, now known as 3D printing. In recent years, the growth and popularity of 3D printers has skyrocketed, as they are increasingly being used by small businesses, hobbyists and entrepreneurs because of their speed and accuracy. There is now even a 3D printer on the International Space Station.

Exciting advances are being made with 3D bioprinting, a method of using 3D printing to create new tissues and organs. The USPTO works with the National Inventors Hall of Fame in running the annual Collegiate Inventors Competition, which has showcased the next generation of 3D printing innovation, such as previous graduate school winner Dave Kolesky for 3D bioprinting of vascularized human tissue. Learn more about 3D bioprinting in the USPTO’s Science of Innovation video, produced by NBC Learn.

The USPTO plays an important role in supporting American businesses in new and growing industries to get new products and technologies to the marketplace faster. This ultimately drives innovation and creates new jobs for American workers, benefitting consumers and manufacturers alike.

Lastly, to stay ahead of the curve in new areas, the agency partners with private industry in other areas such as cyber security and bioscience, all while providing the most up-to-date technical training to patent examiners who examine these new technologies every day.  

          Chapter 10        
Pointers to Functions
Up to this point we have been discussing pointers to data objects. C also
permits the declaration of pointers to functions. Pointers to functions have a
variety of uses and some of them will be discussed here.
Consider the following real problem. You want to write a function that is
capable of sorting virtually any collection of data that can be stored in an
array. This might be an array of strings, or integers, or floats, or even
structures. The sorting algorithm can be the same for all. For example, it could
be a simple bubble sort algorithm, or the more complex shell or quick sort
algorithm. We'll use a simple bubble sort for demonstration purposes.
Sedgewick [1] has described the bubble sort using C code by setting up a
function which when passed a pointer to the array would sort it. If we call that
function bubble(), a sort program is described by bubble_1.c, which follows:
/*-------------------- bubble_1.c --------------------*/

/* Program bubble_1.c from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>

int arr[10] = { 3,6,1,2,3,8,4,1,7,2};

void bubble(int a[], int N);

int main(void)
{
    int i;
    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    bubble(arr, 10);            /* sort the array */
    putchar('\n');
    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    return 0;
}

void bubble(int a[], int N)
{
    int i, j, t;
    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            if (a[j-1] > a[j])       /* 1 */
            {
                t = a[j-1];          /* 2 */
                a[j-1] = a[j];       /* 3 */
                a[j] = t;            /* 4 */
            }
        }
    }
}

/*---------------------- end bubble_1.c -----------------------*/

The bubble sort is one of the simpler sorts. The algorithm scans the array from
the second to the last element comparing each element with the one which
precedes it. If the one that precedes it is larger than the current element, the
two are swapped so the larger one is closer to the end of the array. On the
first pass, this results in the largest element ending up at the end of the
array. The array is now limited to all elements except the last and the process
repeated. This puts the next largest element at a point preceding the largest
element. The process is repeated for a number of times equal to the number of
elements minus 1. The end result is a sorted array.
Here our function is designed to sort an array of integers. Thus in line 1 we
are comparing integers and in lines 2 through 4 we are using temporary integer
storage to store integers. What we want to do now is see if we can convert this
code so we can use any data type, i.e. not be restricted to integers.
At the same time we don't want to have to analyze our algorithm and the code
associated with it each time we use it. We start by removing the comparison from
within the function bubble() so as to make it relatively easy to modify the
comparison function without having to re-write portions related to the actual
algorithm. This results in bubble_2.c:
/*---------------------- bubble_2.c -------------------------*/

/* Program bubble_2.c from PTRTUT10.HTM 6/13/97 */

/* Separating the comparison function */

#include <stdio.h>

int arr[10] = { 3,6,1,2,3,8,4,1,7,2};

void bubble(int a[], int N);
int compare(int m, int n);

int main(void)
{
    int i;
    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    bubble(arr, 10);
    putchar('\n');
    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    return 0;
}

void bubble(int a[], int N)
{
    int i, j, t;
    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            if (compare(a[j-1], a[j]))
            {
                t = a[j-1];
                a[j-1] = a[j];
                a[j] = t;
            }
        }
    }
}

int compare(int m, int n)
{
    return (m > n);
}

/*--------------------- end of bubble_2.c -----------------------*/

If our goal is to make our sort routine data type independent, one way of doing
this is to use pointers to type void to point to the data instead of using the
integer data type. As a start in that direction let's modify a few things in the
above so that pointers can be used. To begin with, we'll stick with pointers to
type integer.
/*----------------------- bubble_3.c -------------------------*/

/* Program bubble_3.c from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>

int arr[10] = { 3,6,1,2,3,8,4,1,7,2};

void bubble(int *p, int N);
int compare(int *m, int *n);

int main(void)
{
    int i;

    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    bubble(arr, 10);
    putchar('\n');
    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    return 0;
}

void bubble(int *p, int N)
{
    int i, j, t;
    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            if (compare(&p[j-1], &p[j]))
            {
                t = p[j-1];
                p[j-1] = p[j];
                p[j] = t;
            }
        }
    }
}

int compare(int *m, int *n)
{
    return (*m > *n);
}

/*------------------ end of bubble_3.c -------------------------*/

Note the changes. We are now passing a pointer to an integer (or array of integers) to bubble(). And from within bubble() we are passing pointers to the elements of the array that we want to compare to our comparison function. And, of course, we are dereferencing these pointers in our compare() function in order to make the actual comparison. Our next step will be to convert the pointers in bubble() to pointers to type void so that the function will become more type insensitive. This is shown in bubble_4.c.
/*------------------ bubble_4.c ----------------------------*/

/* Program bubble_4.c from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>

int arr[10] = { 3,6,1,2,3,8,4,1,7,2};

void bubble(int *p, int N);
int compare(void *m, void *n);

int main(void)
{
    int i;

    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    bubble(arr, 10);
    putchar('\n');
    for (i = 0; i < 10; i++)
    {
        printf("%d ", arr[i]);
    }
    return 0;
}

void bubble(int *p, int N)
{
    int i, j, t;
    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            if (compare((void *)&p[j-1], (void *)&p[j]))
            {
                t = p[j-1];
                p[j-1] = p[j];
                p[j] = t;
            }
        }
    }
}

int compare(void *m, void *n)
{
    int *m1, *n1;
    m1 = (int *)m;
    n1 = (int *)n;
    return (*m1 > *n1);
}

/*------------------ end of bubble_4.c ---------------------*/

Note that, in doing this, we had to introduce in compare() the casting of the void pointer types passed to the actual type being sorted. But, as we'll see later, that's okay. And since what is being passed to bubble() is still a pointer to an array of integers, we had to cast these pointers to void pointers when we passed them as parameters in our call to compare().
We now address the problem of what we pass to bubble(). We want to make the
first parameter of that function a void pointer also. But, that means that
within bubble() we need to do something about the variable t, which is currently
an integer. Also, where we use t = p[j-1]; the type of p[j-1] needs to be known
in order to know how many bytes to copy to the variable t (or whatever we
replace t with).
Currently, in bubble_4.c, knowledge within bubble() as to the type of the data
being sorted (and hence the size of each individual element) is obtained from
the fact that the first parameter is a pointer to type integer. If we are going
to be able to use bubble() to sort any type of data, we need to make that
pointer a pointer to type void. But, in doing so we are going to lose
information concerning the size of individual elements within the array. So, in
bubble_5.c we will add a separate parameter to handle this size information.
These changes, from bubble_4.c to bubble_5.c, are perhaps a bit more extensive than those we have made in the past, so compare the two modules carefully for the differences.
/*---------------------- bubble5.c ---------------------------*/

/* Program bubble_5.c from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>
#include <string.h>

long arr[10] = { 3,6,1,2,3,8,4,1,7,2};

void bubble(void *p, size_t width, int N);
int compare(void *m, void *n);

int main(void)
{
    int i;

    for (i = 0; i < 10; i++)
    {
        printf("%ld ", arr[i]);
    }
    bubble(arr, sizeof(long), 10);
    putchar('\n');
    for (i = 0; i < 10; i++)
    {
        printf("%ld ", arr[i]);
    }

    return 0;
}

void bubble(void *p, size_t width, int N)
{
    int i, j;
    unsigned char buf[sizeof(long)];   /* scratch space for one element */
    unsigned char *bp = p;

    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            if (compare((void *)(bp + width*(j-1)),
                        (void *)(bp + j*width)))     /* 1 */
            {
                /* t = p[j-1]; */
                memcpy(buf, bp + width*(j-1), width);
                /* p[j-1] = p[j]; */
                memcpy(bp + width*(j-1), bp + j*width, width);
                /* p[j] = t; */
                memcpy(bp + j*width, buf, width);
            }
        }
    }
}

int compare(void *m, void *n)
{
    long *m1, *n1;
    m1 = (long *)m;
    n1 = (long *)n;
    return (*m1 > *n1);
}

/*--------------------- end of bubble5.c ---------------------*/

Note that I have changed the data type of the array from int to long to illustrate the changes needed in the compare() function. Within bubble() I've done away with the variable t (which we would have had to change from type int to type long). Instead I have added a buffer of unsigned characters large enough to hold a long; the original code used a fixed buf[4], which assumed a 4-byte long, but sizeof(long) is safer since long is 8 bytes on many modern systems (this buffer will change again in future modifications to this code). The unsigned character pointer *bp is used to point to the base of the array to be sorted, i.e. to the first element of that array.
We also had to modify what we pass to compare(), and how we do the swapping of elements that the comparison indicates need swapping. The use of memcpy() and pointer notation instead of array notation works toward this reduction in type sensitivity. Again, making a careful comparison of bubble5.c with bubble4.c can result in improved understanding of what is happening and why.
We move now to bubble6.c, where we use the same bubble() function that we used in bubble5.c to sort strings instead of long integers. Of course we have to change the comparison function, since the means by which strings are compared is different from that by which long integers are compared. And, in bubble6.c we have deleted the lines within bubble() that were commented out in bubble5.c.
/*--------------------- bubble6.c ---------------------*/

/* Program bubble_6.c from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>
#include <string.h>

#define MAX_BUF 256

char arr2[5][20] = { "Mickey Mouse",
                     "Donald Duck",
                     "Minnie Mouse",
                     "Goofy",
                     "Ted Jensen" };

void bubble(void *p, int width, int N);
int compare(void *m, void *n);

int main(void)
{
    int i;

    for (i = 0; i < 5; i++)
    {
        printf("%s\n", arr2[i]);
    }
    bubble(arr2, 20, 5);
    putchar('\n');
    for (i = 0; i < 5; i++)
    {
        printf("%s\n", arr2[i]);
    }
    return 0;
}

void bubble(void *p, int width, int N)
{
    int i, j, k;
    unsigned char buf[MAX_BUF];
    unsigned char *bp = p;

    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            k = compare((void *)(bp + width*(j-1)), (void *)(bp + j*width));
            if (k > 0)
            {
                memcpy(buf, bp + width*(j-1), width);
                memcpy(bp + width*(j-1), bp + j*width, width);
                memcpy(bp + j*width, buf, width);
            }
        }
    }
}

int compare(void *m, void *n)
{
    char *m1 = m;
    char *n1 = n;
    return (strcmp(m1, n1));
}

/*------------------- end of bubble6.c ---------------------*/

But the fact that bubble() was unchanged from that used in bubble5.c indicates that the function is capable of sorting a wide variety of data types. What is left to do is to pass to bubble() the name of the comparison function we want to use, so that it can be truly universal. Just as the name of an array is the address of the first element of the array in the data segment, the name of a function decays into the address of that function in the code segment. Thus we need to use a pointer to a function, in this case the comparison function.
Pointers to functions must match the functions pointed to in the number and
types of the parameters and the type of the return value. In our case, we
declare our function pointer as:
int (*fptr)(const void *p1, const void *p2);

Note that were we to write:
int *fptr(const void *p1, const void *p2);

we would have a function prototype for a function which returns a pointer to type int. That is because in C the parentheses () operator has a higher precedence than the pointer * operator. By putting parentheses around the string (*fptr) we indicate that we are declaring a function pointer.
We now modify our declaration of bubble() by adding, as its 4th parameter, a function pointer of the proper type. Its function prototype becomes:
void bubble(void *p, int width, int N,
int(*fptr)(const void *, const void *));

When we call bubble(), we insert the name of the comparison function that we want to use. bubble7.c illustrates how this approach permits the use of the same bubble() function for sorting different types of data.
/*------------------- bubble7.c ------------------*/

/* Program bubble_7.c from PTRTUT10.HTM 6/10/97 */

#include <stdio.h>
#include <string.h>

#define MAX_BUF 256

long arr[10] = { 3,6,1,2,3,8,4,1,7,2};
char arr2[5][20] = { "Mickey Mouse",
                     "Donald Duck",
                     "Minnie Mouse",
                     "Goofy",
                     "Ted Jensen" };

void bubble(void *p, int width, int N,
            int (*fptr)(const void *, const void *));
int compare_string(const void *m, const void *n);
int compare_long(const void *m, const void *n);

int main(void)
{
    int i;
    puts("\nBefore Sorting:\n");

    for (i = 0; i < 10; i++)              /* show the long ints */
    {
        printf("%ld ", arr[i]);
    }
    putchar('\n');
    for (i = 0; i < 5; i++)               /* show the strings */
    {
        printf("%s\n", arr2[i]);
    }
    bubble(arr, sizeof(long), 10, compare_long);   /* sort the longs */
    bubble(arr2, 20, 5, compare_string);           /* sort the strings */
    puts("\n\nAfter Sorting:\n");

    for (i = 0; i < 10; i++)              /* show the sorted longs */
    {
        printf("%ld ", arr[i]);
    }
    putchar('\n');
    for (i = 0; i < 5; i++)               /* show the sorted strings */
    {
        printf("%s\n", arr2[i]);
    }
    return 0;
}

void bubble(void *p, int width, int N,
            int (*fptr)(const void *, const void *))
{
    int i, j, k;
    unsigned char buf[MAX_BUF];
    unsigned char *bp = p;

    for (i = N-1; i >= 0; i--)
    {
        for (j = 1; j <= i; j++)
        {
            k = fptr((void *)(bp + width*(j-1)), (void *)(bp + j*width));
            if (k > 0)
            {
                memcpy(buf, bp + width*(j-1), width);
                memcpy(bp + width*(j-1), bp + j*width, width);
                memcpy(bp + j*width, buf, width);
            }
        }
    }
}

int compare_string(const void *m, const void *n)
{
    char *m1 = (char *)m;
    char *n1 = (char *)n;
    return (strcmp(m1, n1));
}

int compare_long(const void *m, const void *n)
{
    long *m1, *n1;
    m1 = (long *)m;
    n1 = (long *)n;
    return (*m1 > *n1);
}

/*------------------- end of bubble7.c ------------------*/
          Chapter 8        
Pointers to Arrays
Pointers, of course, can be "pointed at" any type of data object, including
arrays. While that was evident when we discussed program 3.1, it is important to
expand on how we do this when it comes to multi-dimensional arrays.
To review, in Chapter 2 we stated that given an array of integers we could point
an integer pointer at that array using:
int *ptr;
ptr = &my_array[0]; /* point our pointer at the first
integer in our array */

As we stated there, the type of the pointer variable must match the type of the
first element of the array.
In addition, we can use a pointer as a formal parameter of a function which is
designed to manipulate an array. e.g.
int array[3] = {'1', '5', '7'};
void a_func(int *p);

Some programmers might prefer to write the function prototype as:
void a_func(int p[]);

which would tend to inform others who might use this function that the function
is designed to manipulate the elements of an array. Of course, in either case,
what actually gets passed is the value of a pointer to the first element of the
array, independent of which notation is used in the function prototype or
definition. Note that if the array notation is used, there is no need to pass
the actual dimension of the array since we are not passing the whole array, only
the address to the first element.
We now turn to the problem of the 2 dimensional array. As stated in the last
chapter, C interprets a 2 dimensional array as an array of one dimensional
arrays. That being the case, the first element of a 2 dimensional array of
integers is a one dimensional array of integers. And a pointer to a two
dimensional array of integers must be a pointer to that data type. One way of
accomplishing this is through the use of the keyword "typedef". typedef assigns
a new name to a specified data type. For example:
typedef unsigned char byte;

causes the name byte to mean type unsigned char. Hence
byte b[10]; would be an array of unsigned characters.

Note that in the typedef declaration, the word byte has replaced that which
would normally be the name of our unsigned char. That is, the rule for using
typedef is that the new name for the data type is the name used in the
definition of the data type. Thus in:
typedef int Array[10];

Array becomes a data type for an array of 10 integers. i.e. Array my_arr;
declares my_arr as an array of 10 integers and Array arr2d[5]; makes arr2d an
array of 5 arrays of 10 integers each.
Also note that Array *p1d; makes p1d a pointer to an array of 10 integers.
Because *p1d points to the same type as arr2d, assigning the address of the two
dimensional array arr2d to p1d, the pointer to a one dimensional array of 10
integers is acceptable. i.e. p1d = &arr2d[0]; or p1d = arr2d; are both correct.
Since the data type we use for our pointer is an array of 10 integers, we would expect that incrementing p1d by 1 would change its value by 10*sizeof(int), which it does. That is, sizeof(*p1d) is 20 on a system with 2-byte integers, and 40 where int is 4 bytes, as on most modern systems. You can prove this to yourself by writing and running a simple short program.
Now, while using typedef makes things clearer for the reader and easier on the
programmer, it is not really necessary. What we need is a way of declaring a
pointer like p1d without the need of the typedef keyword. It turns out that this
can be done and that
int (*p1d)[10];

is the proper declaration, i.e. p1d here is a pointer to an array of 10 integers
just as it was under the declaration using the Array type. Note that this is
different from
int *p1d[10];

which would make p1d the name of an array of 10 pointers to type int.
          Chapter 5        
Pointers and Structures
As you may know, we can declare the form of a block of data containing different
data types by means of a structure declaration. For example, a personnel file
might contain structures which look something like:
struct tag {
    char lname[20]; /* last name */
    char fname[20]; /* first name */
    int age;        /* age */
    float rate;     /* e.g. 12.75 per hour */
};

Let's say we have a bunch of these structures in a disk file and we want to read
each one out and print out the first and last name of each one so that we can
have a list of the people in our files. The remaining information will not be
printed out. We will want to do this printing with a function call and pass to
that function a pointer to the structure at hand. For demonstration purposes I
will use only one structure for now. But realize the goal is the writing of the
function, not the reading of the file which, presumably, we know how to do.
For review, recall that we can access structure members with the dot operator, as in:
--------------- program 5.1 ------------------

/* Program 5.1 from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>
#include <string.h>

struct tag {
    char lname[20]; /* last name */
    char fname[20]; /* first name */
    int age;        /* age */
    float rate;     /* e.g. 12.75 per hour */
};

struct tag my_struct; /* declare the structure my_struct */

int main(void)
{
    strcpy(my_struct.lname, "Jensen");
    strcpy(my_struct.fname, "Ted");
    printf("\n%s ", my_struct.fname);
    printf("%s\n", my_struct.lname);
    return 0;
}

-------------- end of program 5.1 --------------

Now, this particular structure is rather small compared to many used in C
programs. To the above we might want to add:
date_of_hire; (data types not shown)

If we have a large number of employees, what we want to do is manipulate the data in these structures by means of functions. For example, we might want a function to print out the name of the employee listed in any structure passed to it. However, in the original C (Kernighan & Ritchie, 1st Edition) it was not possible to pass a structure; only a pointer to a structure could be passed. In ANSI C, it is now permissible to pass the complete structure. But, since our goal here is to learn more about pointers, we won't pursue that.
Anyway, if we pass the whole structure it means that we must copy the contents
of the structure from the calling function to the called function. In systems
using stacks, this is done by pushing the contents of the structure on the
stack. With large structures this could prove to be a problem. However, passing
a pointer uses a minimum amount of stack space.
In any case, since this is a discussion of pointers, we will discuss how we go
about passing a pointer to a structure and then using it within the function.
Consider the case described, i.e. we want a function that will accept as a
parameter a pointer to a structure and from within that function we want to
access members of the structure. For example we want to print out the name of
the employee in our example structure.
Okay, so we know that our pointer is going to point to a structure declared
using struct tag. We declare such a pointer with the declaration:
struct tag *st_ptr;

and we point it to our example structure with:
st_ptr = &my_struct;

Now, we can access a given member by de-referencing the pointer. But, how do we
de-reference the pointer to a structure? Well, consider the fact that we might
want to use the pointer to set the age of the employee. We would write:
(*st_ptr).age = 63;

Look at this carefully. It says, replace that within the parentheses with that which st_ptr points to, which is the structure my_struct. Thus, this breaks down to the same as my_struct.age.
However, this is a fairly often used expression and the designers of C have
created an alternate syntax with the same meaning which is:
st_ptr->age = 63;

With that in mind, look at the following program:
------------ program 5.2 ---------------------

/* Program 5.2 from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>
#include <string.h>

struct tag {                /* the structure type */
    char lname[20];         /* last name */
    char fname[20];         /* first name */
    int age;                /* age */
    float rate;             /* e.g. 12.75 per hour */
};

struct tag my_struct;            /* define the structure */
void show_name(struct tag *p);   /* function prototype */

int main(void)
{
    struct tag *st_ptr;          /* a pointer to a structure */
    st_ptr = &my_struct;         /* point the pointer to my_struct */
    strcpy(my_struct.lname, "Jensen");
    strcpy(my_struct.fname, "Ted");
    printf("\n%s ", my_struct.fname);
    my_struct.age = 63;
    show_name(st_ptr);           /* pass the pointer */
    return 0;
}

void show_name(struct tag *p)
{
    printf("\n%s ", p->fname);   /* p points to a structure */
    printf("%s ", p->lname);
    printf("%d\n", p->age);
}

-------------------- end of program 5.2 ----------------

Again, this is a lot of information to absorb at one time. The reader should
compile and run the various code snippets and using a debugger monitor things
like my_struct and p while single stepping through the main and following the
code down into the function to see what is happening.

          CHAPTER 3        
Pointers and Strings
The study of strings is useful to further tie in the relationship between
pointers and arrays. It also makes it easy to illustrate how some of the
standard C string functions can be implemented. Finally it illustrates how and
when pointers can and should be passed to functions.
In C, strings are arrays of characters. This is not necessarily true in other
languages. In BASIC, Pascal, Fortran and various other languages, a string has
its own data type. But in C it does not. In C a string is an array of characters
terminated with a binary zero character (written as '\0'). To start off our
discussion we will write some code which, while preferred for illustrative
purposes, you would probably never write in an actual program. Consider, for
char my_string[40];

my_string[0] = 'T';
my_string[1] = 'e';
my_string[2] = 'd';
my_string[3] = '\0';

While one would never build a string like this, the end result is a string in
that it is an array of characters terminated with a nul character. By
definition, in C, a string is an array of characters terminated with the nul
character. Be aware that "nul" is not the same as "NULL". The nul refers to a
zero as defined by the escape sequence '\0'. That is it occupies one byte of
memory. NULL, on the other hand, is the name of the macro used to initialize
null pointers. NULL is #defined in a header file in your C compiler, nul may not
be #defined at all.
Since writing the above code would be very time consuming, C permits two
alternate ways of achieving the same thing. First, one might write:
char my_string[40] = {'T', 'e', 'd', '\0',};

But this also takes more typing than is convenient. So, C permits:
char my_string[40] = "Ted";

When the double quotes are used, instead of the single quotes as was done in the
previous examples, the nul character ( '\0' ) is automatically appended to the
end of the string.
In all of the above cases, the same thing happens. The compiler sets aside a contiguous block of memory 40 bytes long to hold characters and initializes it such that the first 4 characters are Ted\0.
Now, consider the following program:
------------------program 3.1-------------------------------------

/* Program 3.1 from PTRTUT10.HTM 6/13/97 */

#include <stdio.h>

char strA[80] = "A string to be used for demonstration purposes";
char strB[80];

int main(void)
{
    char *pA;          /* a pointer to type character */
    char *pB;          /* another pointer to type character */
    puts(strA);        /* show string A */
    pA = strA;         /* point pA at string A */
    puts(pA);          /* show what pA is pointing to */
    pB = strB;         /* point pB at string B */
    putchar('\n');     /* move down one line on the screen */
    while (*pA != '\0')    /* line A (see text) */
    {
        *pB++ = *pA++;     /* line B (see text) */
    }
    *pB = '\0';        /* line C (see text) */
    puts(strB);        /* show strB on screen */
    return 0;
}

--------- end program 3.1 -------------------------------------

In the above we start out by defining two character arrays of 80 characters
each. Since these are globally defined, they are initialized to all '\0's first.
Then, strA has the first 42 characters initialized to the string in quotes.
Now, moving into the code, we declare two character pointers and show the string
on the screen. We then "point" the pointer pA at strA. That is, by means of the
assignment statement we copy the address of strA[0] into our variable pA. We now
use puts() to show that which is pointed to by pA on the screen. Consider here
that the function prototype for puts() is:
int puts(const char *s);

For the moment, ignore the const. The parameter passed to puts() is a pointer,
that is the value of a pointer (since all parameters in C are passed by value),
and the value of a pointer is the address to which it points, or, simply, an
address. Thus when we write puts(strA); as we have seen, we are passing the
address of strA[0].
Similarly, when we write puts(pA); we are passing the same address, since we
have set pA = strA;
Given that, follow the code down to the while() statement on line A. Line A states: while the character pointed to by pA (i.e. *pA) is not a nul character (i.e. the terminating '\0'), do the following:
Line B states: copy the character pointed to by pA to the space pointed to by
pB, then increment pA so it points to the next character and pB so it points to
the next space.
When we have copied the last character, pA now points to the terminating nul
character and the loop ends. However, we have not copied the nul character. And,
by definition a string in C must be nul terminated. So, we add the nul character
with line C.
It is very educational to run this program with your debugger while watching strA, strB, pA and pB and single stepping through the program. It is even more educational if, instead of simply defining strB[] as has been done above, you also initialize it with something like:
char strB[80] = "12345678901234567890123456789012345678901234567890";

where the number of digits used is greater than the length of strA and then
repeat the single stepping procedure while watching the above variables. Give
these things a try!
Getting back to the prototype for puts() for a moment, the "const" used as a
parameter modifier informs the user that the function will not modify the string
pointed to by s, i.e. it will treat that string as a constant.
Of course, what the above program illustrates is a simple way of copying a
string. After playing with the above until you have a good understanding of what
is happening, we can proceed to creating our own replacement for the standard
strcpy() that comes with C. It might look like:
char *my_strcpy(char *destination, char *source)
{
    char *p = destination;
    while (*source != '\0')
    {
        *p++ = *source++;
    }
    *p = '\0';
    return destination;
}

In this case, I have followed the practice used in the standard routine of
returning a pointer to the destination.
Again, the function is designed to accept the values of two character pointers,
i.e. addresses, and thus in the previous program we could write:
int main(void)
{
    my_strcpy(strB, strA);
}
I have deviated slightly from the form used in standard C, which would have the prototype:
char *my_strcpy(char *destination, const char *source);

Here the "const" modifier is used to assure the user that the function will not
modify the contents pointed to by the source pointer. You can prove this by
modifying the function above, and its prototype, to include the "const" modifier
as shown. Then, within the function you can add a statement which attempts to
change the contents of that which is pointed to by source, such as:
*source = 'X';

which would normally change the first character of the string to an X. The const
modifier should cause your compiler to catch this as an error. Try it and see.
Now, let's consider some of the things the above examples have shown us. First
off, consider the fact that *ptr++ is to be interpreted as returning the value
pointed to by ptr and then incrementing the pointer value. This has to do with
the precedence of the operators. Were we to write (*ptr)++ we would increment,
not the pointer, but that which the pointer points to! i.e. if used on the first
character of the above example string the 'T' would be incremented to a 'U'. You
can write some simple example code to illustrate this.
Recall again that a string is nothing more than an array of characters, with the
last character being a '\0'. What we have done above is deal with copying an
array. It happens to be an array of characters but the technique could be
applied to an array of integers, doubles, etc. In those cases, however, we would
not be dealing with strings and hence the end of the array would not be marked
with a special value like the nul character. We could implement a version that
relied on a special value to identify the end. For example, we could copy an
array of positive integers by marking the end with a negative integer. On the
other hand, it is more usual that when we write a function to copy an array of
items other than strings we pass the function the number of items to be copied
as well as the address of the array, e.g. something like the following prototype
might indicate:
void int_copy(int *ptrA, int *ptrB, int nbr);

where nbr is the number of integers to be copied. You might want to play with
this idea and create an array of integers and see if you can write the function
int_copy() and make it work.
This permits using functions to manipulate large arrays. For example, if we have
an array of 5000 integers that we want to manipulate with a function, we need
only pass to that function the address of the array (and any auxiliary
information such as nbr above, depending on what we are doing). The array itself
does not get passed, i.e. the whole array is not copied and put on the stack
before calling the function, only its address is sent.
This is different from passing, say, an integer to a function. When we pass an
integer we make a copy of the integer, i.e. get its value and put it on the
stack. Within the function any manipulation of the value passed can in no way
affect the original integer. But, with arrays and pointers we can pass the
address of the variable and hence manipulate the values of the original array
elements.
Continue with Pointer Tutorial
Back to Table of Contents
          Three Keys to Advancing your Digital Transformation        

Digital assets

With today’s proliferation of data, digital transformation (DX) has become more than a hot topic: It’s an imperative for businesses of all shapes and sizes. The collision of data, analytics and technology has businesses, analysts and consumers excited — and scared — about what could happen next.

On one hand, everyone from banks to bagel shops and travel sites to tractor manufacturers has found new ways to connect the dots in their businesses while forging stronger, more dynamic customer engagement. Artificial intelligence (AI) has come of age in technologies such as smart sensors, robotic arms, and devices that can turn lights and heat on and off, adjust for changes in conditions and preferences, and even automatically reorder food and supplies for us.

However, today's Chief Analytics Officer (and Chief Data Officer and Chief Digital Officer, for example) faces both the promise and precariousness of digitizing business. While significant opportunities abound to drive revenues and customer connectivity, any leader will freely confess there are myriad technological, business and human obstacles to transforming even one element of business, introducing a new unique product or even meeting regulatory requirements.

The Big Data Dilemma

Big Data is at once the promise of the DX and its biggest roadblock. A recent Harvard Business Review article put it succinctly: “Businesses today are constantly generating enormous amounts of data, but that doesn’t always translate to actionable information.”

When 150 data scientists were asked if they had built a machine learning model, roughly one-third raised their hands. How many had deployed and/or used this model to generate value, and evaluated it? Not a single one.

This doesn’t invalidate the role of Big Data in achieving DX. To the contrary: The key to leveraging Big Data is understanding what its role is in solving your business problems, and then building strategies to make that happen — understanding, of course, that there will be missteps and possibly complete meltdowns along the way.

In fact, Big Data is just one component of DX that you need to think about. Your technology infrastructure and investments (including packaged applications, databases, and analytic and BI tools) need to similarly be rationalized and ultimately monetized, to deliver the true value they can bring to DX.

Odds are many components will either be retired or repurposed, and you’ll likely come to the same conclusion as everyone else that your business users are going to be key players in how DX technology solutions get built and used. That means your technology and analytic tools need to allow you the agility and flexibility to prototype and deploy quickly; evolve at the speed of business; and empower people across functions and lines of business to collaborate more than they’ve ever done before.

Beyond mapping out your overarching data, technology and analytic strategies, there are several areas to consider on your DX journey. Over the next three posts, I’ll focus on how to:

  1. Visualize your digital business, not your competitors’
  2. Unleash the knowledge hidden within your most critical assets
  3. Embrace the role and evolution of analytics within your journey

To whet your appetite, check out this short video on the role of AI in making DX-powered decisions.


The post Three Keys to Advancing your Digital Transformation appeared first on FICO.

          The Bi-Valve Audio Enhancer        

Courtesy of MAKE's Flickr pool, the Bi-Valve Audio Enhancer, a simple but fun looking DAP enclosure that leverages a cheap portable iPod speaker to look like some sort of antediluvian sound machine prototype. A MAKE commenter christens the style "hot …

          Range comprehensions with C++ lazy generators        
In the previous post we had a look at a recent proposal N4286 to add stackless coroutines to the C++ language and described the implementation prototype that ships with Visual Studio 2015 CTP. We saw that coroutines can be used to implement lazy sequences. Lazy evaluation is a powerful tool and a pillar of functional […]
          Next low cost iPhone? - 9 to 5 Mac        

MacRumors is reporting that a plastic white iPhone that has surfaced may be the next value iPhone.

We only have 3 issues with these pictures:

1. It is running Cydia, which is unlikely to be an easy install on a new prototype iPhone with a new cellular chip, internal hardware and firmware. It is doubtful Apple is testing Cydia, and it isn't a cakewalk to jailbreak an entirely new device.
2. Probably most damning, it has the tell-tale old white speaker grille at the top, which was phased out as the white iPhone was improved over the past year and a half. It looks like the poster may have tried to blur it as well. It is doubtful that Apple would go back to this flawed design.
3. Plastic?

Other details reported: the device seems to run faster than the iPhone 4, is lighter, and the glass front and back appear to have been replaced by two plastic sheets that feel and sound different from the iPhone 4's.

It is unlikely that Apple would ever replace the glass front with plastic.  Even the $229 iPod touch is glass.  Can you imagine Apple doing a plastic face?  Not possible.

More likely there is a matte overlay on the front of it.

Otherwise, this phone has a plastic back, which would decrease the weight and cost and increase the durability.  Which actually makes some sense – though the $229 iPod touch gets by with stainless steel.  We were throwing this device around yesterday and concluded it was probably an iPhone 4 prototype that was brought back up to the surface – there is no shortage of those in Asia.  Two more shots below.




          Make: Iris Skylight        

Interested in making your own Mechanical Iris as seen in Make: Magazine's recent skylight project? We've partnered with Make: to bring you this project, and as we do with our other design & make projects, we have made the Fusion 360 design files available here.  This is an advanced fabrication project with a number of potential uses and ample opportunity for customization.  


In working with Caleb on this project, we produced two variations of this "pivot and slot" overlapping-leaf iris design. The first was made as a scale-model prototype, called the 10" Benchtop Iris, and the second is a full-size 26" Skylight Iris (the same one featured in the Make: Skylight project). Designed in Fusion 360, these files include the geometry, hardware, and CAM toolpaths that we used. Use caution when applying the toolpaths: while they worked for our fixturing and tooling, you should verify them before running on your machine.  


Assembly View

Both versions rely on the same basic architecture: a top cover that houses the motor and bearing ring, a lower fixed plate that provides the slots the leaves move in, and a moving plate that the leaves pivot on; the bearing ring retains the moving plate to the top cover, allowing it to rotate but otherwise stay centered. There are 12 iris leaves that attach to the moving plate's pivot holes and the slots in the fixed plate. Smaller components like the drive gear (which connects to the motor) and the hardware that attaches the top cover to the lower fixed plate (with spacers) finish the assembly.


Note: The versions here use an NEMA 23 stepper motor, which requires electronics and programming know-how and the following major components.  Alternatively, a crank handle could be substituted for the motor drive with some customization. 

Electronics (advanced)

  • NEMA 23 Stepper Motor (420oz)
  • NEMA 23 Motor Cable
  • CRP5056 Stepper Driver
  • 48v Unregulated Power Supply
  • Arduino / Embedded Controller to generate STEP/DIR signals


Material Cut List

  • Benchtop 10" Iris
    1. (3) 24" x 24" x 1/2" HDPE, Plywood, or MDF (Fixed Plate, Top Cover, Bearing)
    2. (1) 24" x 24" x 3/4" HDPE, Plywood, or MDF (Moving Plate-Gear Ring)
    3. (1) 24" x 36" Plastic or Metal Sheet (Iris leaves)
  • Make: Skylight 26" Iris
    1. (3) 48" x 48" x 1/2" HDPE, Plywood, or MDF (Fixed Plate, Top Cover, Bearing)
    2. (1) 48" x 48" x 3/4" HDPE, Plywood, or MDF (Moving Plate-Gear Ring)
    3. (1) 48" x 96" Plastic or Metal Sheet (Iris leaves)


Source Files  

New in: Design and Make Project Series


Chalet prototypes were studied and researched, finally allowing us to let go of mainstream Balinese aesthetics in approaching a hotel design such as this. The idea was simply to use the lagoon as part of the interior experience, with very open views of the surrounding low-density development. This makes the concept powerful, the chalet design honest and straightforward, and the intent completely refreshing. The product is most challenging in that it raises issues of typology and user experience.

          CSU-Kent State Partnership Accelerates Tech Transfer        

New TeCK Fund made possible by Third Frontier Commission grant

CSU-KSU Partnership

An innovative new partnership will soon help accelerate the process of bringing faculty innovations to commercial markets in Northeast Ohio and beyond.

The Ohio Third Frontier Commission has awarded Cleveland State University and Kent State University a grant to develop a joint technology commercialization and startup fund. The program will give university faculty and startup companies access to an $800,000 fund that will assist in accelerating commercialization of university technology and bringing innovative new products to market.

“CSU and Kent State have unique research portfolios that provide significant opportunities for commercialization in a host of fields, from drug development to assistive devices to liquid crystals,” says Jack Kraszewski, Director of the Technology Transfer Office at CSU. “This new fund will accelerate the process for licensing technology while spurring the development of additional technology transfer opportunities with numerous companies across the state.”

“This is about getting intellectual property and innovations out of the laboratory and into the marketplace,” notes Stephen Roberts, Director of Technology Commercialization at Kent State. “Universities often do not have ready access to funding for the prototype creation and market research necessary to create successful spin-out companies. This helps fill those gaps.”

The hybrid accelerator program — to be called the TeCK Fund — will accept applications from faculty teams investigating opportunities to commercialize research and will also assist in connecting university researchers with potential business partners. Funding of up to $100,000 will be available for individual technology validation projects. This is the first time Third Frontier has awarded funds to a joint university collaborative.

“This fund will be a key component of Kent State and CSU’s broader efforts to commercialize more of our inventions and assist in creating jobs and new economic opportunities in Northeast Ohio,” Roberts adds.

The Ohio Third Frontier is a technology-based economic development initiative, and is part of the Ohio Development Services Agency. Third Frontier is committed to transforming the state’s economy through the accelerated growth of diverse startup and early stage technology companies.


          Thoughts for ArtServe Interview        
Computer interface in a shoe box.

Today I had an interview with Jennifer Baum, a writer for ArtServe Michigan. They're doing an article on the Kalamazoo Makers Guild meetup group. In preparation for our discussion Jennifer was kind enough to supply me with some topics we might discuss and I jotted down some notes while I thought about what I would say. Here are those notes and roughly what I said.
About Kalamazoo Makers Guild...

The Kalamazoo Maker's Guild is a group of people interested in DIY technology, science and design. We more or less pattern ourselves after the Homebrew Computer Club that founded Silicon Valley. Like them, our members tend to have some background in a related profession, but that's by no means a prerequisite. This group is about the things we do for fun, because they interest us, and anybody can be interested in making stuff. We meet every couple of months, report on the status of our various projects and sometimes listen to a presentation or hold an ad hoc roundtable on a topic that catches our interest. "Probably the most useful aspect of the group is that you start to feel accountable to the other members of the group and you're motivated to make progress on your project before the next meeting."

How did it get started...

When I gave up my web design business I ended the professional graphic design association I'd formed on Meetup, and then I had room on the service to start another group. MAKE magazine had really caught my attention. I did a few projects from the magazine and thought it would be fun and helpful to know other people who were working on the same kinds of things. The group didn't get going, though, until about 8 months ago when Al Hollaway posted to an online forum about RepRap 3D printers at the same time I was building one. He wanted to meet and talk about RepRap. I told him about my Meetup group. We joined forces and here we are. Meetup is a great web site because it's a web service that's all about meeting people nearby in person to share a common interest.

About membership and kinds of projects ...

The group is growing steadily now. We have twenty-some members and we're seeing membership tick up at an increasing rate month to month. We have a high school student who is working on designing assistive devices for the blind using sonic rangefinders, one member who at the last meeting showed off a prototype of a computer interface built into a shoe box, and another member who is on the verge of completing a working DIY Segway (the self-balancing scooter) made using a pair of battery-powered drills for motors. Al should be done with his RepRap 3D printer and I've just finished my 2nd. At least two other members are in some stage of building their own 3D printers. I'm building both a laser etcher and a 3D scanner right now, and I'm excited to start playing with the products of a couple Kickstarter projects I've backed. There are a few of us about to start building CNC milling machines, and there's been a lot of excitement in the group around the brand new, hard-to-get Raspberry Pi (a $25 computer). Almost all the members so far have dabbled in a bit of Arduino hacking. One member is designing a flame thrower for Burning Man. Another is making a calibration device for voltage meters. So, there's a range of things going on.

Where do I see this headed....

Our approach to this group has been to learn from the mistakes other groups have made. All of the other groups I've seen in Kalamazoo start out with facilities and try to bring in members to support and justify it. Getting people to work on actual projects that interest them is something that comes later down the road. It's the, "if you build it they will come" approach. Those groups quickly get into trouble managing the building and funding, and they go away. We're coming at it from the opposite direction. We're gathering together a community of makers first, people who are already doing things on their own. Once we reach a tipping point then we'll worry about the next step, like getting a hackerspace put together. That kind of bottom-up approach is, I think, much more sustainable and durable, and it fits in with our modern culture (particularly in the maker subculture.)  It was good enough for Homebrew, so it's good enough for us.

About impact...

Silicon Valley came out of a group like this, so the potential is there for us to have a big impact on the community. Being a college town we have access to a lot of smart people, and Kalamazoo has a strong progressive, energetic, entrepreneurial vibe going on. I think what's more likely, though, is that we will have an impact in aggregate with all the other makers--groups and individuals--around the globe.

"Makers aren't just hacking new technologies, we're hacking a new economy. We're trying to figure out how to live in a world without scarcity."

The unsung official slogan of the RepRap project is, "wealth without money."

I don't know that another story like Apple is likely to happen again. Steve Jobs relied on a very traditional, very closed model for his business, as did most of the people of that era who went on to make a name for themselves in technology. The ethos of that time was centered around coming up with a big idea and capitalizing on that idea to the exclusion of the competition. It's interesting that even then this view was at odds with that of his partner, Steve Wozniak, who was content to build computers in his garage and share what he learned with his friends at Homebrew. In this way Wozniak was much more like the modern maker/hacker and is probably one of this hobby's forefathers.

Makers/hackers today are all about openness and sharing -- not in a hippy, touchy-feely kind of way, but in a calculated way that weighs the costs and benefits of being open versus closed. The success of Linux and the ever-increasing number of open source software, and now hardware, projects has proven that there's enormous power in being open. "We tend to think that's the way to change the world."

About the Maker Movement....

I know there are a lot of people who are keen to talk about the "maker movement" but I'm not so sure that I would characterize it as a movement. If it is, then it started in the 60's with people like my dad who were HAM radio enthusiasts and tinkered around with making their own radios and antennas. I think that what we're observing and calling a movement is really an artifact of reaching the steep part of Moore's Law. Ray Kurzweil is famous for talking about this phenomenon. The pace of advances in technology is itself accelerating, it's exponential, and moving so fast now that if you're not paying close attention things seem to pop out of nowhere. For makers, technology has reached a point where Moore's Law has forced down prices and increased the availability of things that just a few years ago were far out of reach. We're just taking those things and running with it. In effect, we're just the people paying close attention.

About me...

I started college in the engineering program at WMU, but I couldn't hack it and dropped out. I went back to community college and got a degree in graphic design. In my professional life I've been paid to be a web designer, photographer, videographer, IT manager, technical document writer, photo lab manager, artist, and I've even been paid to be a poet. For fun I do all those things and also play guitar, peck at a piano, and watch physics and math lectures from the MIT OpenCourseWare web site, do exercises on Khan Academy, play board games and roleplaying games, and commit acts of crafting -- woodworking and model making. For work, I now teach at the Kalamazoo Institute of Arts. I've taught web design, digital illustration and this fall I'll be teaching classes in 3D modeling and 3D printing with the RepRap 3D printer I have on loan there. I live near downtown Kalamazoo with my wife and many pets, including a 23 year old African Grey parrot named KoKo.

Post interview notes...

I mentioned SoliDoodle, the fully assembled, $500 3D printer. The big hackerspace in Detroit is called i3detroit. Also, Chicago has Pumping Station: One. I'm on the forums for both and will be visiting each this summer. The presentation about 3D scanning we had was from Mike Spray of Laser Abilities. You can actually see the entire presentation on my YouTube channel. Thingiverse was the web site that I kept going on about where you can find 3D designs for printing.

          A Pattern of Success        

A host of student-run operations is springing up to support Northeastern startups, offering skills and expertise in fields from accounting to graphic design to software coding. To help these groups forge an alliance, faculty member Dan Gregory made a lead gift last fall, inspiring parents, alumni, and faculty to join in. To date, support for the new initiative, which students named Mosaic, comes to $425,000.

The rise of student-led organizations helping students, faculty, and alumni realize their business dreams is a grass-roots phenomenon, Gregory says. It began six years ago with Northeastern's student-run venture incubator, IDEA, for which he serves as faculty advisor. Inspired by IDEA's success with peer-to-peer experiential learning, groups at diverse colleges started offering their know-how to IDEA's ventures. The trend picked up steam in 2013, when students at the College of Arts, Media, and Design formed a design studio, Scout, to create logos and packaging for fledgling ventures. At about the same time, law students at the IP Clinic (now called the IP CO-LAB) started teaching ventures about intellectual property law. Today, Gregory says, student accountants, prototype manufacturers, social enterprise advocates, and other Mosaic members are helping ventures thrive. Instead of seeking specialized skills from outside vendors, entrepreneurs can often find them on campus.

MEMBERS UNITE

A major goal of Mosaic is to break down silos across campus and give ventures access to the multiple disciplines that entrepreneurship requires. Besides providing administrative support, Mosaic funds will cover expenses for which IDEA or one of Northeastern's colleges might have picked up the tab, from member services to events, pizza-fueled workshops, and meeting space. To obtain Mosaic funding, member organizations must apply to the Mosaic Council, whose faculty and administrators set priorities. To understand how Mosaic works, consider its impact on one venture: Wizio.

This online platform, a matchmaking service for Boston realtors and renters, got its start at the Husky Startup Challenge, an Entrepreneurs Club contest that helps students turn ideas into companies. After taking first place and the Audience Choice award in April 2015, Wizio won funding from IDEA to build a prototype. From there, IDEA linked Wizio's founders to more Mosaic service providers. A Scout team came up with a logo and designed a website. At the School of Law, students with the IP CO-LAB assessed copyright issues, while others at the Community Law Clinic drew up an employee contract. Meanwhile, accounting students who founded D'Amore-McKim's Accounting Resource Center, known as ARC, outlined the tax advantages of incorporation. If Mosaic has a theme, Gregory says, it's cross-college collaboration: "Students across disciplines are helping students, alumni, and faculty launch their ventures."

DOMINOS IN MOTION

As soon as Gregory kick-started Mosaic, others came forward. They include law and business professor Susan Montgomery, the IP CO-LAB's faculty advisor; the Northeastern University Young Global Leaders, an alumni group convened by President Joseph E. Aoun; Greg Skloot, DMSB'12, former Entrepreneurs Club president and now vice president for Growth at Netpulse; and Lea Anne Dunton, PNT, and her husband, Gary, DMSB'78, a Northeastern corporator. In addition to supporting member groups that are up and running, Mosaic helps new ones get started. Last fall, for example, at the College of Engineering, students formed a hardware-prototyping group they call Generate, housed within the Michael J. and Ann Sherman Center for Engineering Entrepreneurship Education. Beyond his role at IDEA, Gregory co-directs the Northeastern University Center for Entrepreneurship Education, Mosaic's administrative home, helping every college foster entrepreneurship.

From this vantage, he calls Mosaic's peer-to-peer, experiential learning model the "secret sauce" behind Northeastern's thriving entrepreneurial culture, one that "sets us apart from every other university."
          Beats by Dr. Dre Studio ~ Review & Giveaway        
I was very excited when I was contacted by Staples to receive a pair of beats by dr. dre studio for review. My husband and I have been looking for a good pair of headphones for a while. We had heard {beats by dr. dre studio} were pretty nice and I couldn't wait to test them out.

While waiting for my headphones to arrive, I got a special lil surprise in the mail from my contact, Jessica. It was a Staples Easy Button. LOL I couldn't help but crack a smile. My life just became easier. YAY! My kids have since been pressing the button and the "that was easy" verse has been a regular sound here in my house. Although they were a little disappointed when it didn't make their homework magically get done! Gotta love that easy button.  Getting a {Lot of Laughs} over here! Thank you, Jessica!

. . . On to my review of these amazing headphones . . .
As soon as the headphones arrived I was impressed. Impressed with the packaging (yes, I loved the box it came in) quality, sound, carrying case and that I could use them for phone calls. I immediately put them to use. First we all (the kids and I) tried them out by listening to some music. The music sounded clear and precise. We all wanted to use them all night. When my husband got home he tried them out too. He said they were awesome and we got a kick out of him singing out loud. He even tried to steal them from me to take on the road. Nope, they are mine! LOL Ok, maybe I will share a little, but not just yet.
Next, I tried them out while making some phone calls. While they don't cancel out all noise (kids playing in the background) they do cancel it out enough so I can actually hear what the other person is saying. With other headphones I can still hear my kids and I have to stop and tell them to quiet down while I talk.  With these I don't have to at all.  I also worry about the other person hearing the kids in the background, but with the beats by dr. dre studio I don't. The microphone does a great job of canceling out the background noise.  I tested this out in the car, where it gets pretty loud sometimes and it passed.  I also want to mention that the transition from music to a phone call is smooth and when you hang up, it goes back to your music with ease.  

The earcups size and cushion make the headphones very comfortable to wear. They are just the right size for children, teens and adults. I do have to say though at the cost of these headphones, I myself plan on only my husband and I using them and possibly supervised use for the kids.  The headphones are durable and I think they would be just fine for kids use, but it's our preference to keep them for adult use most of the time. The headphone case is very impressive with its rigid construction. I didn't even know they came with a case, so I was excited to know I had somewhere to safely keep my headphones and cords all in one place. This case will keep your headphones safe and together when traveling too!
I would definitely recommend the beats by dr. dre studio to family and friends. They would make a perfect gift for anyone any time of the year. Thanks again to Staples and Jessica for allowing me to host this review.
The featured product(s) in this post were provided to me free of charge by the manufacturer or PR company representing the company. All opinions expressed in this review are my own and not influenced in any way by anyone. By entering this giveaway you agree to the Trendy Treehouse Disclaimer/Terms of Use.

Staples has generously offered to GIVEAWAY a pair of dr. dre studio headphones to one of my lovely readers. Valued up to $180.
(Headphones pictured below are just an example of what you could win. The winner will choose a pair valued up to $180 on
Enter this giveaway below.

The original Beats that took the world by storm

After 25 years, Dr. Dre was tired of spending months on a track only to have his fans hear it on weak, distorting ear buds. Two years and hundreds of prototypes later, Beats Studio headphones are the icons that bring you sound the way it was originally intended.
  • Powered by a pair of AAA batteries, the noise cancellation feature in a pair of stylish Beats Studio headphones amplifies the music as it blocks out noise, giving you consistently powerful and intense sound
  • The ear cushions are super plush and covered with ultra-soft breathable materials, so you’ll be cool and comfortable even during marathon listening sessions. The Beats by Dr. Dre logo on the side also acts as a mute button when you press it. No need to take them off to talk.
  • Take calls, skip songs and adjust volume right from the cord of your Beats Studio headphones. No more searching for your phone or music player just to find the right song
  • A flawless and iconic fit. The perfect lightweight, foldable headphone that fits all sizes and shapes comfortably. The result is a design now copied by brands around the world.
  • Beats Studio on ear Headphones, Two (2) AAA batteries, 3.5mm audio cable, 1/4" audio adapter, In-line remote & mic cable (features may vary), Hard Shell carrying case, Beats cleaning cloth
"People aren't hearing all the music.

Artists and producers work hard in the studio perfecting their sound. But people can't really hear it with normal headphones. Most headphones can't handle the bass, the detail, the dynamics. Bottom line, the music doesn't move you.

With Beats, people are going to hear what the artists hear, and listen to the music the way they should: the way I do." - Dr. Dre

It's All About Sound
Three years of thorough research and development resulted in the most incredible headphone speaker ever built. Beats features highly advanced materials and construction to deliver a new level of audio accuracy and clarity. Combining extra-large speaker drivers and a high-power digital amplifier, Beats delivers an unprecedented combination of super deep bass, smooth undistorted highs, and crystal clear vocals never heard before from headphones.

Less Noise, More Music With Powered Isolation
Today's digital audio recording technology gives music more detail than ever before. Unfortunately, the details get easily lost in today's noisy world: on the street, on the bus, on the plane. The best listening experience isn't just about what you hear, but what you don't. Monster's powered isolation technology actively cuts external noise, so you experience all the rich details your favorite artists want you to hear.

Extreme Comfort is Music to Your Ears
With Beats, you feel the music, not the headphones. Spacious earcups give you extra room for a higher level of listening comfort. Plush ear cushions covered with ultra-soft breathable materials keep you cool even when the music's hot.

Ready for iPhone?
Stay connected without missing a beat. Beats comes with an additional Monster iSoniTalk™ iPhone-enabled headphone cable with a built-in answer button and microphone so you can easily stop rockin' and start talkin'.


Wired with Monster Cable
Advanced Monster™ Cable headphone cable with Quadripole™ 4 twisted pair construction reduces signal loss for balanced sound and clarity.

Micro Minijack
Compact connector design reduces bulk and eases unnecessary strain that can damage headphone connectors and ports.

Push To Listen
Integrated mute button lets you listen to the outside world without removing your Beats.

Folding Design
Beats fold into a compact shape for easy packing wherever you're going.

Touring Case
Rugged case with rigid construction keeps Beats and accessories safe during transport.

Monster CleanCloth
Ultra-soft cleaning cloth with AEGIS™ microbe shield keeps Beats looking good and controls germs on your ear cushions.

Scratch-Resistant Gloss Finish
Advanced materials make Beats stand out from the crowd, and keep them looking good.

What's In The Box
  • Beats Studio headphones
  • Monster Cable headphone cable
  • Monster iSoniTalk™ iPhone enabled headphone cable
  • Rigid Tour case
  • Anti-Microbial Cleaning cloth
  • 1/8" to 1/4" Adapter
  • Two AAA batteries
One Year Limited Warranty

  • Weight: 260 grams, 270 grams with batteries
  • Headphone cable length: 1.3 meters
  • Connector: 1/8 inch (3.5mm), gold-plated

          Artifacts of Desktop Biotech        
Biotech prototype poems + mixed media images, by Robert Bolton. “Artifacts of Desktop Biotech” is an inventory of the recently possible; it documents emerging and possible manifestations of synthetic biology. As the capabilities of biotech become increasingly democratized, we are seeing a shift from “lab […]
           Google Glass is Helping Scientists to Study Certain Brain Diseases        
Google Glass is an optical head-mounted display that is worn like a pair of eyeglasses. Using Google Glass, Siberian scientists are developing a prototype

          On your Arizona vacation don't miss attractions such as the Reid Park Zoo        
Make your vacation plans with one of the top online vacation guides to Arizona. Explore attractions in southeastern Arizona and visit attractions such as the Pinnacle Peak Park or the Arizona Trail if you are seeking the great outdoors. Or relax in luxury on an Arizona spa vacation. Or take your children to the Toy Train Operating Museum. Here are a few of the Arizona attractions awaiting you on your vacation!

Visit Pinnacle Peak Park in Tucson, Arizona. You will find there is much more to Pinnacle Peak Park than simply hiking and climbing. Pinnacle Peak Park is located in North Scottsdale, Arizona, east of Pima Road, off Alma School Road between Happy Valley Road and Dynamite Boulevard. The park entrance is on 102nd Way, just west of the Pinnacle Peak Patio restaurant. Trail Dust Town (where Pinnacle Peak is) has been a Tucson landmark for over 40 years. Pinnacle Peak Park also has a variety of volunteer opportunities available. Location: 6541 E. Tanque Verde Rd., Tucson, Arizona

Tubac Artist Colony - Tubac, Arizona. This community of southern Arizona is quickly becoming a growing artist colony. Although it is not an "artist colony" in the truest sense, it is home to a group of some of the best artists in the Southwest. The colony provides premier studio spaces, but it is the art classes on offer that really set it apart. Contact the artists through their websites to sign up for classes. Tubac’s present incarnation as an artists’ colony began in the 1940s with the opening of Dale Nichols’ artist school. Location: 45 miles south of Tucson off I-19.

Arizona Trail - Southern Border to Northern Border, Arizona. Bisecting the state of Arizona from its southern border with Mexico to the northern border with Utah, this trail gives hikers, bikers and horseback riders an unmatched opportunity to enjoy the beautiful state of Arizona. There are 807 miles of trails to enjoy. The Arizona Trail is a continuous, 800+ mile diverse and scenic trail across Arizona from Mexico to Utah. Currently 94% of the trail is complete. The Arizona Trail Association's mission is simple: build, maintain, promote, protect and sustain the Arizona Trail as a unique encounter with the land. Use common sense when using the Arizona Trail.

Desert spa resorts in Arizona are especially popular during the winter months. Phoenix, Scottsdale and Tucson are home to several first class spa vacation resorts. The Boulders Resort features the Golden Door Spa which opened in 2001. The spa offers numerous treatments inspired by the Arizona desert surroundings. Arizona Biltmore, located in Phoenix, has a 22,000 square foot spa center, eight heated pools and three outdoor whirlpool spas. The resort spa uses Sonoran desert plants, stones and mud as a basis for its innovative spa treatments. Arizona features several well-known destination spas. The health spa offers over 100 facial and body treatment options.

Tour the Reid Park Zoo in Tucson, Arizona. The Reid Park Zoo is home to more than 400 animals. You’ll see all of your favorite zoo animals in comfortable habitat environments located throughout Reid Park Zoo. The Reid Park Zoo features an impressive assortment of beautiful Asian Animals, including Gibbons, bears and tigers. You’ll also see majestic Asian tigers at the Zoo, another endangered species the Zoo is active in protecting. African animals are well represented at the Reid Park Zoo. Brightly colored Macaws are another type of bird exhibited at the Reid Park Zoo. You’ll gain a unique understanding and appreciation of the effort it takes to keep the Zoo’s animal residents healthy, happy and safe at the Reid Park Zoo.

Fort Lowell Museum - Tucson, Arizona. Troop strength at Fort Lowell averaged 130 officers and 239 enlisted men. Serving at Fort Lowell were companies representing the 2nd, 4th, 5th and 6th Cavalry Regiments, and the 1st, 8th and 12th Infantry Regiments. The buildings at Fort Lowell reflected a Mexican Sonoran style of architecture. Today the Commissary building and the ruins of the hospital remain to the west of Fort Lowell Park. The one intact Officers' Quarters, on the Adkins Steel parcel, represents the most complete original structure from the 1870s Fort Lowell. Since 1963 the Arizona Historical Society has operated a branch museum at the Fort Lowell Historic Site. Location: 2900 N. Craycroft Road, Tucson, Arizona - the museum is located in Old Fort Lowell Park at the corner of Craycroft and Fort Lowell Road in Tucson.

Gadsden Pacific Division Toy Train Operating Museum - Tucson, Arizona. There are 90 members of this organization dedicated to the advancement of model railroading by the collection and operation of toy trains and railroad memorabilia, as well as preserving prototype railroad history by sponsoring railroading related activities and events to share with the public. O-Gauge, G-Scale, Standard Gauge, S-Gauge, HO-Scale, N-Scale, Z-Scale displays are all set up for the showing of the different styles of model trains at the museum. GPD hosts two toy train shows/swap meets annually, now called the Winter Toy Train (formerly Coyote) and Summer Toy Train (formerly Roadrunner) Shows.
Penelope SanMateo is a travel writer for Arizona Beautiful. Her travel articles provide insights into attractions and events you don't want to miss while on your Arizona vacation. She recently visited the Reid Park Zoo while touring southern Arizona and the city of Tucson. The zoo offers an impressive assortment of animals, from Asian bears and tigers to a variety of African animals including a white rhinoceros.
          HTC's 5-inch behemoth phone spied in press render, may be called One X 5 (update: some caution)        

HTC One X 5 leak

Something big has been brewing at HTC, most often referred to as the DLX or by its less-than-flattering 6435LVW name. While there have been unconfirmed photos of prototypes floating around, a Sina Weibo user has posted what we have reason to believe is an authentic press image of the finished result: meet the One X 5. As the name and image suggest, the phablet-class device should be dominated by a 5-inch (and possibly 1080p) screen that makes even a regular One X look dainty. Internal details haven't been nailed down alongside the looks, although previous benchmarks have had it using a Snapdragon S4 that might ultimately be a quad-core S4 Pro. There are no immediate signs of a stylus or other tricks besides that sea of glass. We may not have long to wait before we find out, however. HTC just happens to have a New York City event planned for this Wednesday, and previously detected links between the 6435LVW and Verizon could see the One X 5 quickly reach the US if it's meant to show at that gathering -- although it might get another name change to fit into the Droid family.

Update: We've been given a heads-up that this is a device posted as a "concept" by Danny Tu on Flickr, which raises doubt that we'll see exactly what's on show here (or see that name). However, it still lines up with earlier photos and what we've heard. We'll keep you posted as to whether or not it reflects reality in the end.

Source: Sina Weibo (sign-in required)

          2016 Year in Review & Hiatus Notice        

Slacking off day after day, I drifted to the end of 2016 before I knew it, so I plan to keep drifting into next year, and to announce that this blog is on hiatus until the new blog engine goes live.

The blog was updated fairly diligently this year, so it was not an obscure one. I wrote quite a few long-form posts, my knack for padding grew, my sharp tongue got worse, and almost every post looked down on somebody to some degree (including this one).

I started slipping personal asides into my gaming articles, playing with bits of trivia, and lately I keep voicing opinions at odds with the gaming world's "political correctness." I turned down olive branches from several gaming media outlets and sites, and currently plan to develop indie games on my own.

Following last year's release of the Duokan Assistant, this year I naturally went on to write quite a few Chrome extensions, to the point that I almost forgot what my actual job is; people often take me for "the plugin guy"...

I started taking personal financial management seriously and entered the A-share market for the first time, with a total return of 17%. I have given up on investing in other financial products, having realized I have a fairly strong appetite for risk.

All in all, it was a year with gains on many fronts. It reads like a laundry list, but I really did get some meaningful things done.


After restarting my Douban log, I pushed my count of "watched" films and shows on Douban to 700; the next goal is 1,000.


My ratings are fairly mainstream compared to some influencers', so perhaps they can serve as a reference for deciding what to catch up on.

While sorting my ratings I noticed that political correctness now runs rampant on Douban, voting with one's feet is rampant too, and scores swing wildly. Many titles now score far from what they did when I rated them, and scores are mostly drifting downward; I don't know whether pirated copies lowering the cost of rating is the cause, or something else. Stranger still, some of this year's bad films keep scores worthy of good ones: Warcraft, a trash adaptation that wasted its director's talent and rode the IP at the box office, holds at 7.8; Stephen Chow's nostalgia-trading dud The Mermaid holds at 6.9; Kabaneri of the Iron Fortress, thoroughly ruined by Okouchi's idiotic script, holds at 6.9; Ajin Season 2, with its brain-dead original plot, passes for a hidden gem at 8.8; and JoJo Part 4, Black Mirror S3 and Thunderbolt Fantasy, which I gave 4 stars above, still sit at fairly high scores today. Astonishing.

Sometimes I genuinely wonder whether the problem is with me or with society.

But that's how it is; what else could it be. Maybe I should write an extension that hides all the scores and, based on the distribution of ratings, automatically shows a perhaps more reasonable text verdict instead.

Even this annual review ran to this much rambling. Ah well, time to rest. See you next year.

          ZombiU 2 prototype development has begun, Ubisoft dev confirms        
ZombiU 2 is being prototyped as you read this, according to Ubisoft Montpellier’s creative director Jean-Philippe Caro. Responding to a keen fan on Twitter, Caro wrote the following: It follows comments made by the team back in January, that suggested co-op may be a feature in whatever the studio is working on now. Would you […]
          Halo Lamp beta        
Show your Halo pride with this simple LED light. I saw this design on Xbox Live years ago and loved it. So I made a T-shirt of it for a buddy. Turns out, a lot of people dug it. So I’m trying my hand at this simple design. I think it came out pretty cool, but I’m considering it the prototype, the beta, the 1.0.0 version, etc. This was my first Ponoko project so I’ve learned a few things and have improvements for this project in mind. Depending on the response I’ll consider a 2.0 version. And also a deluxe version in all plastic or something. An assembly Instructable is here:

          A Thing of Beauty        
I’ve long been a fan of the copper tools from Implementations, and Jane was showing prototypes of her latest design at Malvern. The British-made sieves will be added to the range shortly – in the meantime I will say once more that her Castor trowel is my favourite tool – ever.
          The business class where high school students are running the company        
Students at Herndon High School in Virginia talk business during their Virtual Enterprises International (VE) class.

As both a business owner and student, 20-year-old entrepreneur Bobby Lenahan thought the idea of starting his own venture while helping others sounded really cool.

Lenahan’s invention — IV Hero, a superhero-themed paper sleeve that fits over IV bags to make kids feel less scared — is now used in three hospitals, including Memorial Sloan Kettering Cancer Center, and soon two more. He used a McDonald’s french fry container as his earliest prototype.

Lenahan, an alum of William Floyd High School in Mastic, New York, credits a high school business class with giving him the skills and drive to start his own company.

“It’s not just a PowerPoint where teachers are telling you about the income statement. No, you have to go and make the income statement.” Lenahan said of the class that lets students run their own business for a year.

In Virtual Enterprises International, also known as VE, students write a business plan, buy and sell products online with other high schools (no currency is actually exchanged), maintain personal and company bank accounts, work with mentors and compete in state and international competitions.

Lenahan was vice president of accounting for his VE class, which created a fitness equipment sales company, Forever Fit by Floyd. The students are “very driven,” said Lenahan. “They’re the movers and shakers of tomorrow. And you can tell they are going to do it. And VE is what offers them those skills.”

The New York City Department of Education began the VE program in 1996 after the Department of Education sent Iris Blanc, an assistant principal at the time, to Austria in order to study how schools incorporated business simulation classes into their education system. The business programs Blanc observed had grown out of the European apprenticeship model, which relied on the “learn by doing” principle.

Although it started as a pilot project in just seven New York City schools, word soon spread to other states. Within five years, 65 high schools had started VE programs. In 2011, VE became a nonprofit organization, and today, VE is in 400 high schools across 19 states and is part of an international network of schools in 42 countries. Just under 50 percent of students are from low-income and minority backgrounds.

Blanc says that the transformation of the kids is the best part of her job. “Some completely unmotivated by school have now identified their own talents and self-confidence,” she said.

“This class is entirely a hands-on approach — quite different from what we’re used to,” said Cyril Antoney, a current 11th grade VE student at Herndon High School in Virginia. He attributes his entrepreneurial spirit and belief in the American dream to the class, his teacher and his family, who immigrated to the U.S. from India when he was 5.

“I think if you have high expectations of kids, they will achieve that level,” said Antoney’s teacher Kathy Thomas, who’s been teaching the class for the past seven years.

Antoney’s class in Herndon started ScholarME, a software company that helps students find college scholarships. ScholarME took first place in the business plan state competition, which qualified them to compete in the international trade show in New York in April. The competition is similar to Shark Tank, “except,” he said, “the sharks are a lot nicer.”

Twenty states in the U.S., highlighted in blue, require students to take an economics course, according to a 2016 report by the Council for Economic Education. States with a black triangle must offer economics, states in bold require implementation of state standards and states with a circle require standardized testing.

Carly Shay, a senior at Lakeland High School in White Lake, Michigan, said VE pushed her into a new career path. Shay had planned to go into the culinary arts, but after taking VE for three years and overcoming her shyness as CFO and then CEO, she decided she wanted to major in accounting when she starts at the University of Akron next fall.

Shay’s VE teacher, Wendy Schmitt, recalled another student, whom she had hardly heard say more than a few words all year, yell across the trade floor at a state competition: “Mrs. Schmitt, it turns out I’m an outstanding salesman.” She thought to herself, “I don’t care what happens the rest of the year because [this student] is an outstanding salesman.”

While VE continues to grow nationwide, only 20 states require high school students to take an economics class, and just 17 states require courses in personal finance, according to the Council for Economic Education.

“I think it’s imperative we prepare kids for the business world,” said Richard Stacy, who owns a consultancy business with his wife Karen Stacy. Both are professors at Metropolitan School of Professional Studies at Catholic University in Washington, D.C., and also mentor the Herndon students. “Acquiring the knowledge is fine, but if you can’t use it, it doesn’t do you a whole lot of good,” Stacy said.

The post The business class where high school students are running the company appeared first on PBS NewsHour.

          TurnKey 13 out, TKLBAM 1.4 now backup/restores any Linux system        

This is really two separate announcements rolled into one:

  1. TurnKey 13 - codenamed "satisfaction guaranteed or your money back!"

    The new release celebrates 5 years since TurnKey's launch. It's based on the latest version of Debian (7.2) and includes 1400 ready-to-use images: 330GB worth of 100% open source, guru-integrated, Linux system goodness in 7 build types that are optimized and pre-tested for nearly any deployment scenario: bare metal, virtual machines and hypervisors of all kinds, "headless" private and public cloud deployments, etc.

    New apps in this release include OpenVPN, Observium and Tendenci.

    We hope this new release reinforces the explosion in active 24x7 production deployments (37,521 servers worldwide) we've seen since the previous 12.1 release, which added 64-bit support and the ability to rebuild any system from scratch using TKLDev, our new self-contained build appliance (AKA "the mothership").

    To visualize active deployments world wide, I ran the access logs through GeoIPCity and overlaid the GPS coordinates on this Google map (view full screen):


  2. TKLBAM 1.4 - codenamed "give me liberty or give me death!"

    Frees TKLBAM from its shackles so it can now backup files, databases and package management state without requiring TurnKey Linux, a TurnKey Hub account or even a network connection. Having those will improve the usage experience, but the new release does its best with what you give it.

    I've created a convenience script to help you install it in a few seconds on any Debian or Ubuntu derived system:

    wget -O - -q $URL | PACKAGE=tklbam /bin/bash

    There's nothing preventing TKLBAM from working on non-Debian/Ubuntu Linux systems as well; you just need to install from source and disable APT integration with the --skip-packages option.

    Other highlights: support for PostgreSQL, MySQL views & triggers, and a major usability rehaul designed to make it easier to understand and control how everything works. Magic can be scary in a backup tool.

    Here's a TurnKey Hub screenshot I took testing TKLBAM on various versions of Ubuntu:

    Screenshot of TurnKey Hub backups

Announcement late? Blame my problem child

As those of you following TurnKey closely may have already noticed, the website was actually updated with the TurnKey 13.0 images a few weeks ago.

I was supposed to officially announce TurnKey 13's release around the same time but got greedy and decided to wrap up TKLBAM 1.4 first and announce them together.

TKLBAM 1.4 wasn't supposed to happen. That it did is the result of a spontaneous binge of passionate development I got sucked into after realizing how close I was to making it a lot more useful to a lot more people. From the release notes:

More people would find TKLBAM useful if:

  • If it worked on other Linux distributions (e.g., Debian and Ubuntu to begin with)

  • If users understood how it worked and realized they were in control. Magic is scary in a backup tool.

  • If it worked without the TurnKey Hub or better yet without needing a network connection at all.

  • If users realized that TKLBAM works with all the usual non-cloud storage back-ends such as the local filesystem, rsync, ftp, ssh, etc.

  • If users could more easily tell when something is wrong, diagnose the problem and fix it without having to go through TKLBAM's code or internals

  • If users could mix and match different parts of TKLBAM as required (e.g., the part that identifies system changes, the part that interfaces with Duplicity to incrementally update their encrypted backup archives, etc.)

  • If users could embed TKLBAM in their existing backup solutions

  • If users realized TKLBAM allowed them to backup different things at different frequencies (e.g., the database every hour, the code every day, the system every week)

    Monolithic all-or-nothing system-level backups are not the only way to go.

  • If it could help with broken migrations (e.g., restoring a backup from TurnKey Redmine 12 to TurnKey Redmine 13)

  • If it worked more robustly, tolerated failures, and with fewer bugs
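The "different things at different frequencies" point from the list above lends itself to a sketch. Below is a minimal, hypothetical scheduler deciding which backup jobs are due at a given time; the job names and schedule are made up for illustration, and the actual tklbam invocations are deliberately not shown:

```python
# Sketch of frequency-based backups (hypothetical job names; the commands
# each job would run, e.g. via tklbam, are not shown here).
from datetime import datetime

# frequency -> predicate on the current time, assuming one pass per hour
SCHEDULES = {
    "hourly": lambda now: True,                                  # every pass
    "daily":  lambda now: now.hour == 2,                         # 02:00
    "weekly": lambda now: now.weekday() == 6 and now.hour == 3,  # Sunday 03:00
}

JOBS = [
    ("database", "hourly"),
    ("code",     "daily"),
    ("system",   "weekly"),
]

def jobs_due(now: datetime) -> list[str]:
    """Return the names of jobs due at 'now', assuming one pass per hour."""
    return [name for name, freq in JOBS if SCHEDULES[freq](now)]
```

In practice you'd drive something like this from cron; the point is simply that system, code and database backups need not share one monolithic schedule.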

So that's why the release announcement is late and Alon is slightly pissed off but I'm hoping the end result makes up for it.

TurnKey 13: from 0.5GB to 330GB in 5 years

Big things have small beginnings. We launched TurnKey Linux five years ago in 2008 as a cool side project that took up 0.5GB on SourceForge and distributed 3 installable Live CD images of LAMP stack, Drupal and Joomla.

5 years later the project has ballooned to over 330GB spanning 1400 images: 100 apps, 7 build types, in both 64-bit and 32-bit versions. So now we're getting upset emails from SourceForge asking if the project really needs to take up so much disk space.

Yes, and sorry about that. For what it's worth, realizing TurnKey may eventually outgrow SourceForge is part of the reason we created our own independent mirror network (well, that and rsync/ftp access). SourceForge is great, but just in case...

93,555 lines of code in 177 git repos

In terms of development, I recently collected stats on the 177 git repositories that make up the app library, self-contained build system, and a variety of custom components (e.g., TKLBAM, the TurnKey Hub).

It turns out over the years we've written about 93,555 lines of code just for TurnKey, most of it in Python and shell script. Check it out:

Late but open (and hopefully worth it)

TurnKey 13 came out a few months later than we originally planned. By now we have a pretty good handle on what it takes to push out a release so the main reason for the delay was that we kept moving the goal posts.

In a nutshell, we decided it was more important for the next major TurnKey release to be open than it was to come out early.

The main disadvantage was that Debian 7 ("Wheezy") had come out in the meantime and TurnKey 12 was based on Debian 6 ("Squeeze"). On the other hand, Debian 6 would be supported for another year, and since TurnKey is just Debian under the hood, nothing prevented impatient users who wanted to upgrade the base operating system to Debian 7 from going through the usual automated and relatively painless Debian upgrade procedure.

So we first finished work on TKLDev, put it through the trenches with the TurnKey 12.1 maintenance release, and moved the project's development infrastructure to GitHub where all development could happen out in the open.

We hoped to see a steady increase in future open source collaboration on TurnKey's development and so far so good. I don't expect the sea to part as it takes more than just the right tools & infrastructure to really make an open source project successful. It takes community and community building takes time. TurnKey needs to win over contributors one by one.

Alon called TurnKey 13.0 "a community effort" which I think in all honesty may have been a bit premature, but we are seeing the blessed beginnings of the process in the form of a steadily growing stream of much appreciated community contributions. Not just new prototype TurnKey apps and code submissions but also more bug reports, feature requests and wiki edits.

And when word gets out on just how fun and easy it is to roll your own Linux distribution I think we'll see more of that too. Remember, with TKLDev, rolling your own Debian based Linux distribution is as easy as running make:

root@tkldev ~$ cd awesomenix
root@tkldev turnkey/awesomenix$ make

You don't even have to use TKLDev to build TurnKey apps or use any TurnKey packages or components. You can build anything you want!

Sadly, I've gotten into the nasty habit of prepending TKL - the TurnKey initials - to all the TurnKey related stuff I develop but under the hood the system is about as general purpose as it can get. It's also pretty well designed and easy to use, if I don't (cough) say so myself.

I'll be delighted if you use TKLDev to help us improve TurnKey but everyone is more than welcome to use it for other things as well.

3 new TurnKey apps - OpenVPN, Tendenci and Observium

  • OpenVPN: a full-featured open source SSL VPN solution that accommodates a wide range of configurations, including remote access, site-to-site VPNs, Wi-Fi security, and more.

    Matt Ayers from Amazon asked us to consider including an OpenVPN appliance in the next release and Alon blew it out of the park with the integration for this one.

    The new TurnKey OpenVPN is actually a 3 for 1 - TurnKey's setup process asks whether you want OpenVPN in client, server or gateway mode and sets things up accordingly.

    My favourite feature is the one that allows the admin to create self destructing URLs with scannable QRcodes that makes setting up client OpenVPN profiles on mobiles a breeze. That's pretty cool.

  • Tendenci: a content management system built specifically for NPOs (Non Profit Organizations).

    Upstream's Jenny Qian did such an excellent job developing the new TurnKey app that we accepted it into the library with only a few tiny modifications.

    This is the first time an upstream project has used TKLDev to roll their own TurnKey app. It would be awesome to see more of this happening and we'll be happy to aid any similar efforts in this vein any way we can.

  • Observium: a really cool autodiscovering SNMP based network monitoring platform.

    The new TurnKey app is based on a prototype developed by Eric Young, who also developed a few other prototype apps which we plan on welcoming into the library as soon as we work out the kinks. Awesome work Eric!

Special thanks

Contributing developers:

Extra special thanks

  • Alon's wife Hilla: for putting up with too many late work sessions.
  • Liraz's girlfriend Shir: for putting up with such a difficult specimen (in general).

          TKLBAM: a new kind of smart backup/restore system that just works        

Drum roll please...

Today, I'm proud to officially unveil TKLBAM (AKA TurnKey Linux Backup and Migration): the easiest, most powerful system-level backup anyone has ever seen. Skeptical? I would be too. But if you read all the way through you'll see I'm not exaggerating and I have the screencast to prove it. Aha!

This was the missing piece of the puzzle that has been holding up the Ubuntu Lucid based release batch. You'll soon understand why and hopefully agree it was worth the wait.

We set out to design the ideal backup system

Imagine the ideal backup system. That's what we did.

Pain free

A fully automated backup and restore system with no pain. That you wouldn't need to configure. That just magically knows what to backup and, just as importantly, what NOT to backup, to create super efficient, encrypted backups of changes to files, databases, package management state, even users and groups.

Migrate anywhere

An automated backup/restore system so powerful it would double as a migration mechanism to move or copy fully working systems anywhere in minutes instead of hours or days of error prone, frustrating manual labor.

It would be so easy you would, shockingly enough, actually test your backups. No more excuses. You'd test as frequently as you know you should, avoiding unpleasant surprises at the worst possible time.

One turn-key tool, simple and generic enough that you could just as easily use it to migrate a system:

  • from Ubuntu Hardy to Ubuntu Lucid (get it now?)
  • from a local deployment, to a cloud server
  • from a cloud server to any VPS
  • from a virtual machine to bare metal
  • from Ubuntu to Debian
  • from 32-bit to 64-bit

System smart

Of course, you can't do that with a conventional backup. It's too dumb. You need a vertically integrated backup that has system level awareness. That knows, for example, which configuration files you changed and which you didn't touch since installation. That can leverage the package management system to get appropriate versions of system binaries from package repositories instead of wasting backup space.
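As a rough illustration of that idea (not TKLBAM's actual implementation): a system-aware backup can record the installed package set and store only the delta against the base image, reinstalling from the repositories on restore. The manifests below are made-up stand-ins for real package lists such as dpkg output:

```python
# Sketch: rather than archiving system binaries, record which packages are
# installed and restore them from package repositories. The package sets
# below are illustrative stand-ins for real manifests.

def package_delta(base: set[str], current: set[str]) -> tuple[set[str], set[str]]:
    """Return (to_install, to_purge) needed to turn 'base' into 'current'."""
    return current - base, base - current

base_image = {"apache2", "mysql-server", "openssh-server"}
running_system = {"apache2", "mysql-server", "php5", "drupal7"}

to_install, to_purge = package_delta(base_image, running_system)
# Only the package *names* in the delta need backing up, not their binaries.
```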

This backup tool would be smart enough to protect you from all the small paper-cuts that conspire to make restoring an ad-hoc backup such a nightmare. It would transparently handle technical stuff you'd rather not think about like fixing ownership and permission issues in the restored filesystem after merging users and groups from the backed up system.

Ninja secure, dummy proof

It would be a tool you could trust to always encrypt your data. But it would still allow you to choose how much convenience you're willing to trade off for security.

If data stealing ninjas keep you up at night, you could enable strong cryptographic passphrase protection for your encryption key that includes special countermeasures against dictionary attacks. But since your backup's worst enemy is probably staring you in the mirror, it would need to allow you to create an escrow key to store in a safe place in case you ever forget your super-duper passphrase.

On the other hand, nobody wants excessive security measures forced down their throats when they don't need them and in that case, the ideal tool would be designed to optimize for convenience. Your data would still be encrypted, but the key management stuff would happen transparently.

Ultra data durability

By default, your AES encrypted backup volumes would be uploaded to inexpensive, ultra-durable cloud storage designed to provide 99.999999999% durability. To put 11 nines of durability in perspective, if you stored 10,000 backup volumes you could expect to lose a single volume once every 10 million years.
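
For the curious, the 10-million-year figure follows directly from the durability number, as a quick sanity check:

```latex
\underbrace{10^{4}}_{\text{volumes stored}}
\times
\underbrace{10^{-11}}_{\text{expected annual loss fraction}}
= 10^{-7}\ \text{volumes lost per year}
\;\Longrightarrow\;
\text{one lost volume per } 10^{7} \text{ (10 million) years.}
```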

For maximum network performance, you would be routed automatically to the cloud storage datacenter closest to you.

Open source goodness

Naturally, the ideal backup system would be open source. You don't have to care about free software ideology to appreciate the advantages. As far as I'm concerned any code running on my servers doing something as critical as encrypted backups should be available for peer review and modification. No proprietary secret sauce. No pacts with a cloudy devil that expects you to give away your freedom, nay worse, your data, in exchange for a little bit of vendor-lock-in-flavored convenience.

Tall order huh?

All of this and more is what we set out to accomplish with TKLBAM. But this is not our wild eyed vision for a future backup system. We took our ideal and we made it work. In fact, we've been experimenting with increasingly sophisticated prototypes for a few months now, privately eating our own dog food, working out the kinks. This stuff is complex so there may be a few rough spots left, but the foundation should be stable by now.

Seeing is believing: a simple usage example

We have two installations of TurnKey Drupal6:

  1. Alpha, a virtual machine on my local laptop. I've been using it to develop the TurnKey Linux web site.
  2. Beta, an EC2 instance I just launched from the TurnKey Hub.

In the new TurnKey Linux 11.0 appliances, TKLBAM comes pre-installed. With older versions you'll need to install it first:

apt-get update
apt-get install tklbam webmin-tklbam

You'll also need to link TKLBAM to your TurnKey Hub account by providing the API-KEY. You can do that via the new Webmin module, or on the command line:

tklbam-init QPINK3GD7HHT3A

I now log into Alpha's command line as root (e.g., via the console, SSH or web shell) and do the following:

tklbam-backup

It's that simple. Unless you want to change defaults, no arguments or additional configuration required.

When the backup is done a new backup record will show up in my Hub account:

To restore I log into Beta and do this:

tklbam-restore 1

That's it! To see it in action watch the video below or better yet log into your TurnKey Hub account and try it for yourself.

Quick screencast (2 minutes)

Best viewed full-screen. Having problems with playback? Try the YouTube version.

The screencast shows TKLBAM command line usage, but users who dislike the command line can now do everything from the comfort of their web browser, thanks to the new Webmin module.

Getting started

TKLBAM's front-end interface is provided by the TurnKey Hub, an Amazon-powered cloud backup and server deployment web service currently in private beta.

If you don't have a Hub account already, request an invitation. We'll do our best to grant them as fast as we can scale capacity on a first come, first served basis. Update: currently we're doing ok in terms of capacity so we're granting invitation requests within the hour.

To get started log into your Hub account and follow the basic usage instructions. For more detail, see the documentation.

Feel free to ask any questions in the comments below. But you'll probably want to check with the FAQ first to see if they've already been answered.

Upcoming features

  • PostgreSQL support: PostgreSQL support is in development but currently only MySQL is supported. That means TKLBAM doesn't yet work on the three PostgreSQL based TurnKey appliances (PostgreSQL, LAPP, and OpenBravo).
  • Built-in integration: TKLBAM will be included by default in all future versions of TurnKey appliances. In the future when you launch a cloud server from the Hub it will be ready for action immediately. No installation or initialization necessary.
  • Webmin integration: we realize not everyone is comfortable with the command line, so we're going to look into developing a custom webmin module for TKLBAM. Update: we've added the new TKLBAM webmin module to the 11.0 RC images based on Lucid. In older images, the webmin-tklbam package can also be installed via the package manager.

Special salute to the TurnKey community

First, many thanks to the brave souls who tested TKLBAM and provided feedback even before we officially announced it. Remember, with enough eyeballs all bugs are shallow, so if you come across anything else, don't rely on someone else to report it. Speak up!

Also, as usual during a development cycle we haven't been able to spend as much time on the community forums as we'd like. Many thanks to everyone who helped keep the community alive and kicking in our relative absence.

Remember, if the TurnKey community has helped you, try to pay it forward when you can by helping others.

Finally, I'd like to give extra special thanks to three key individuals that have gone above and beyond in their contributions to the community.

By alphabetical order:

  • Adrian Moya: for developing appliances that rival some of our best work.
  • Basil Kurian: for storming through appliance development at a rate I can barely keep up with.
  • JedMeister: for continuing to lead as our most helpful and tireless community member for nearly a year and a half now. This guy is a frigging one man support army.

Also special thanks to Bob Marley, the legend who's been inspiring us as of late to keep jamming till the sun is shining. :)

Final thoughts

TKLBAM is a major milestone for TurnKey. We're very excited to finally unveil it to the world. It's actually been a not-so-secret part of our vision from the start. A chance to show how TurnKey can innovate beyond just bundling off the shelf components.

With TKLBAM out of the way we can now focus on pushing out the next release batch of Lucid based appliances. Thanks to the amazing work done by our star TKLPatch developers, we'll be able to significantly expand our library so by the next release we'll be showcasing even more of the world's best open source software. Stir It Up!

          Cover Reveal: Oracle        
One of these days I will figure out this whole self-publishing thing so I can manage an actual cover reveal/promotion with more than ten days before the publication date. But because of reasons, here's the cover for ORACLE: The Project Files Part 1. 

I really do love this. Robin Ludwig Designs did a great job, and I couldn't be happier.

Pre-order at Amazon here.


Dr. Dean Frey is a man of science. His lifelong desire to create a better future for mankind has led him to the prestigious, and highly mysterious, Wilderness Institute of Scientific Research & Technology, as the head of their Robotics Engineering department. Building on the research and designs of others before him, Dean’s own genius culminates in the successful creation of Anthony—the first fully-automated, free-thinking android prototype. And now Wilderness wants to sell Anthony to the military.

Unwilling to allow his achievement to become weaponized, Dean reaches out to a former Wilderness employee with the resources to help him steal Anthony and relocate them both to safety. He’s put into contact with the very secretive Nick and Olivia, who ask for one simple thing in return: trust us, no matter what you see or hear. Blind trust isn’t in Dean’s cautious nature, but he has no other choice.

For telekinetic Olivia, rescuing a fellow Psion from a life of imprisonment and experimentation is one of her favorite things. Being paid is nice, but she’ll do the job for free, if it means giving Wilderness the finger. When Olivia’s reclusive mentor solicits her and her telepathic partner Nick’s help in smuggling a very special Project out of Wilderness, they jump at the chance to infiltrate their former home and do some internal damage to the institute that created them.

With their combined knowledge of the facility, breaking Anthony out of Wilderness should have been easy—but Olivia learned a long time ago to never underestimate her enemies, or the lengths they’ll go to retrieve what’s theirs. And this time, the price for stealing the Project may be more than she’s willing to pay.


Other vendor links coming soon.

          I HAVE BOOK NEWS        
I've been somewhat vague about upcoming projects, because I'm still getting the hang of this self-publishing thing, and I'm always afraid I'll jinx myself if I speak before I have my ducks in a row.

Right now, my ducks are all at least in the same pond, so here's what's cooking, by way of a long story: way back in college, when I was still considering a career as a screenwriter, I began working on what was then a pilot for a TV show featuring characters with telepathic/telekinetic abilities. I wrote two episodes, and then put them away because what was someone from Delaware going to do with TV show scripts, right?

Not long after that, I got back into writing prose fiction, thanks to participating in fanfiction forums online. So I pulled those scripts out of the drawer and rewrote them as novels. First they were two long novels. Then three short novels. Then one long novel. Over the years, they've been rewritten, because my writing skills have vastly improved in the last 13-odd years. I removed unnecessary flashbacks. Killed a small subplot. Tightened what was actually happening.

And now they're ready for you guys. As a complete duology, with book one releasing this summer, and book two in the fall. Don't worry, there are no major cliffhangers at the end of book one. I wouldn't do that to you guys.

ORACLE: The Project Files Part 1 is scheduled for release on July 18th. I will have early paperback copies for sale at the Meet the Pros event at Shore Leave 38 that weekend, for folks who attend. I've long described the books as "The A-Team with superpowers," if that piques your interest at all.

I'm super-excited to be working with the very talented Robin Ludwig Design Inc. on my cover art. She did the artwork for Requiem for the Dead and The Night Before Dead, and she killed it both times.

And to give you an idea of what to expect from this book, the finalized back cover blurb is below.


Dr. Dean Frey is a man of science. His lifelong desire to create a better future for mankind has led him to the prestigious, and highly mysterious, Wilderness Institute of Scientific Research & Technology, as the head of their Robotics Engineering department. Building on the research and designs of others before him, Dean’s own genius culminates in the successful creation of Anthony—the first fully-automated, free-thinking android prototype. And now Wilderness wants to sell Anthony to the military.

Unwilling to allow his achievement to become weaponized, Dean reaches out to a former Wilderness employee with the resources to help him steal Anthony and relocate them both to safety. He’s put into contact with the very secretive Nick and Olivia, who ask for one simple thing in return: trust us, no matter what you see or hear. Blind trust isn’t in Dean’s cautious nature, but he has no other choice.

For telekinetic Olivia, rescuing a fellow Psion from a life of imprisonment and experimentation is one of her favorite things. Being paid is nice, but she’ll do the job for free, if it means giving Wilderness the finger. When Olivia’s reclusive mentor solicits her and her telepathic partner Nick’s help in smuggling a very special Project out of Wilderness, they jump at the chance to infiltrate their former home and do some internal damage to the institute that created them.

With their combined knowledge of the facility, breaking Anthony out of Wilderness should have been easy—but Olivia learned a long time ago to never underestimate her enemies, or the lengths they’ll go to retrieve what’s theirs. And this time, the price for stealing the Project may be more than she’s willing to pay.


Acegi's authentication and authorization are built on two main techniques: the servlet Filter mechanism and AOP interception. FilterSecurityInterceptor protects URIs, MethodSecurityInterceptor intercepts and protects service-layer methods, and the ACL module filters and protects individual domain Objects.

HttpSessionContextIntegrationFilter: stores the SecurityContext in the HttpSession
ChannelProcessingFilter: redirects to another protocol, e.g. HTTP to HTTPS

ConcurrentSessionFilter: uses none of SecurityContextHolder's functionality, but updates the SessionRegistry to reflect the principal making the current request; a Listener registered in web.xml watches for session events and publishes them, and the SessionRegistry uses those notifications to track each user's session count.

AuthenticationProcessingFilter: the ordinary form-based authentication mechanism (the one most applications use)

CasProcessingFilter: CAS authentication

BasicProcessingFilter: HTTP Basic authentication

HttpRequestIntegrationFilter: obtains the Authentication from the container's HttpServletRequest.getUserPrincipal()

JbossIntegrationFilter: JBoss-specific.

SecurityContextHolderAwareRequestFilter: integrates with the servlet container.

RememberMeProcessingFilter: cookie-based ("remember me") authentication.

AnonymousProcessingFilter: anonymous authentication.

ExceptionTranslationFilter: catches all Acegi Security exceptions and either returns an HTTP error response or launches the appropriate AuthenticationEntryPoint

AuthenticationEntryPoint: the entry point into the authentication process

1. FilterToBeanProxy: delegates requests to the FilterChainProxy

2. FilterChainProxy: conveniently chains multiple Filters together, such as the filters described above; if URIs are to be protected, it can also include FilterSecurityInterceptor. Mind the order of the filters.
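
As a rough illustration of how these pieces were wired together in a Spring context (the bean ids here are hypothetical and the filter list abbreviated), a FilterChainProxy definition looked something like this:

```xml
<bean id="filterChainProxy" class="org.acegisecurity.util.FilterChainProxy">
  <property name="filterInvocationDefinitionSource">
    <value>
      CONVERT_URL_TO_LOWERCASE_BEFORE_COMPARISON
      PATTERN_TYPE_APACHE_ANT
      /**=httpSessionContextIntegrationFilter,authenticationProcessingFilter,exceptionTranslationFilter,filterSecurityInterceptor
    </value>
  </property>
</bean>
```

Each name after the `=` refers to a filter bean defined elsewhere in the context, and the filters run in the order listed, which is why the order matters.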

3. AbstractSecurityInterceptor: the dispatch center, responsible for calling the various modules to do their work.
FilterSecurityInterceptor: intercepts and protects URIs
AspectJSecurityInterceptor: intercepts and protects methods
MethodSecurityInterceptor: intercepts and protects methods

4. AuthenticationManager: user authentication
-> AuthenticationProvider: where authentication actually takes place (there may be several).
-> UserDetailsService: returns a UserDetails object carrying its GrantedAuthority list, or throws an exception.

5. AccessDecisionManager (UnanimousBased/AffirmativeBased/ConsensusBased): authorization
-> AccessDecisionVoter (RoleVoter/BaseAclEntryVoter): the voters that actually cast votes (there may be several).

6. RunAsManager: replaces the GrantedAuthority

7. AfterInvocationManager: modifies the returned object
-> BaseInvocationProvider: where the return-object modification actually takes place (there may be several).


Posted by 草儿, 2007-12-16 21:31

          Startup Gospel        
I just read this twice.

A few parts I found particularly compelling:

What matters is not ideas, but the people who have them. Good people can fix bad ideas, but good ideas can't save bad people.
There's a great line right before this about how a VC will tell you to piss off if you ask them to sign an NDA. The inconvenience of signing the damn thing is worth more to them than your stupid idea. 
The only way to make something customers want is to get a prototype in front of them and refine it based on their reactions. 
Ohhhh, so spending a year in "stealth mode" isn't actually the way to go? Whew. 
To make something users love, you have to understand them. And the bigger you are, the harder that is. So I say "get big slow." The slower you burn through your funding, the more time you have to learn.
The most important way to not spend money is by not hiring people. I may be an extremist, but I think hiring people is the worst thing a company can do. 
Funny how I just posted a craigslist ad looking for more help. 
As with office space, the number of your employees is a choice between seeming impressive, and being impressive. Any of you who were nerds in high school know about this choice. Keep doing it when you start a company.
This was after several paragraphs about how cheap office space was in the early days. There's a line about commuting home to the burbs. "god help you" he says. Heh.

And the closest to home for me these days:
During this time you'll do little but work, because when you're not working, your competitors will be. My only leisure activities were running, which I needed to do to keep working anyway, and about fifteen minutes of reading a night. I had a girlfriend for a total of two months during that three year period. Every couple weeks I would take a few hours off to visit a used bookshop or go to a friend's house for dinner. I went to visit my family twice. Otherwise I just worked.

          Experienced/Senior Front End Developer (M/F) - Aduneo - Val de Fontenay 
We are recruiting an experienced/senior Front End developer. You will be expected to: implement innovative software solutions to digitize business processes; participate in the development of the identity management tools deployed by Aduneo; build application prototypes and propose visual designs. Specific duties and activities of the position: analyze requirements; identify the issues raised in order to propose optimal solutions. ...
          Minute Actu: winners of the Dakar hackathon        

Christine Traoré, Djibril Cissé and Alioune Niang are the winners of the Dakar hackathon, a competition organized as part of the Afrique Innovation - Réinventer les médias program initiated by CFI Medias and Code for Africa. In one night, they built a prototype of Minute Actu, a mobile application meant to give young Senegalese people […]

The article Minute Actu: winners of the Dakar hackathon appeared first on TechOfAfrica.

          Musical Instruments Restoration #8: Violin 2 - Jig for the ribs on action        

It took me a little while to do the steel work on the jig.
I had to make a sort of C-clamp with a push block: cutting threaded bars and hunting for nuts. Shaping the push blocks (made from ebony) became tiresome after making 16 pieces. These are made from scraps.

Here are some self-explanatory photos… I'll give notes on some important aspects and ideas that you can adapt to suit your project and the materials you have available:

1) The piece being bent was narra, ripped by handsaw with a vertical grain orientation and sanded to a final thickness of 1 mm. I tried the longitudinal way (horizontal grain orientation), but it split, because narra is a medium-hard and brittle wood. With the vertical grain orientation the grain is strong vertically; during bending some portions (especially those at the sharp curve) still split, but they can be recovered by gluing. I need the vertical strength for the violin ribs because they support both the back and front plates. The secret to bending this narra wood is boiling it in direct contact with water. If you just steam it, the heat will not be enough: the steam quickly dries up and easily cools down, and the narra regains its brittleness as it dries. (I never tried heating it with electric heaters.)

2) I underestimated the length of the C-clamps (not all, but a few). They were too close together, so I could not insert the rib plus the wood strip and the vinyl retainer into the jig and push block. The solution was to reduce the diameter of the crossing pin. You can see in the picture above that one C-clamp uses an old concrete drill as the pin. It works, with a little bit of ingenuity. GOOD THINKING.

3) Another point is the length of the wood strips. With this design I had no idea how long they would be, so I made the upper rib longer. The protrusion at the corners made it very difficult to insert the C-rib into the jig. There was no solution but to cut it in place, and this resulted in damage to one corner (a split and a crack). THIS IS THE REASON why professional VIOLIN makers use a CORNER POST instead of joining the ribs end to end at the corners as above. Well, this is the challenge I got. Anyway, the violin I am restoring is a vintage one, not a modern one.

LEARNING IDEA: The planning was there, but none of it was put down in sketches. The available materials were just leftovers and scrap; the TARGET was to build a jig at low cost. A PROTOTYPE must be treated as PRECIOUS, as it will become the ORIGINAL. In my case the idea was only to build what I needed for the restoration of my violin, but in the end this became a promising jig for making more violins in the future. Notice that I used EBONY push blocks to get a lasting jig, and it rewarded me with its strength: the bolts that push the blocks did not even puncture the ebony.

Another close-up look at the C-rib clamped in the jig.

Here is the amazing trick I discovered. The retainer piece I use is vinyl; I boiled it together with the rib and the push block, then bent it to shape while still hot. Quite easy. When the vinyl cools down, it retains the shape. I clamp the edges in the vise to prevent spring-back while cooling.

LEARNING IDEA: The scope for ideas is boundless. Though small in size, a controlled automatic steamer or boiler like this one is what solved my bending problem. It is actually a sterilizer; this is where I dipped those ribs.

Meanwhile, while the ribs were clamped and still wet, I had to regroove the plates…

I found that the grooves were dirty, so I used the Dremel to clean them and widen them a bit, just enough to hold the ribs.
LESSON LEARNED: The error, most of the time, is trying to force something that will not fit. Cleaning and rework are what should be done, rather than spending a lot of time attempting the impossible.

THE RIBS STAYED in form overnight… Early last Sunday morning, I was surprised by the success. I took out the clamps, and here it is after fitting it to the plate:

FITTING BOTH PLATES without glue… Here are photos of the front and bottom plates fitted in place…

So, until then. As you can see, nothing unusual has come up that would make me back out, so life must go on.
NEXT will be the hardest part of them all…

Hope you enjoyed this installment, as this is the real part of the game in making the ribs of the violin, with the jig in action.

Have a nice day!

          079: PMA 2009: Fuji Finepix Real 3D        
In this video, Alex investigates the Fuji Finepix Real 3D prototype with stereo lenses at PMA 2009.
          MPLAB IDE C programming µc project        

I noticed a difference between writing a simple C or C++ program and working with a project tree, for example in MPLAB IDE.
The project consists of many source files (.c) and headers (.h); there is one main.c.
All the C and C++ books teach you to prototype and define functions outside main, then call them from inside main and pass values, references, or results back and forth.
Now, in this huge OS C project, written for a particular µC, main is rather small and has a lot of preprocessor #ifs, but the real programming is done in those source files, which consist of function after function. I didn't see a single prototype for any of them. When I write a new function, no parameters are passed when I call that function from within another function. Also, when I declared a variable at the top of a source file, assuming I would create a global variable, it did not work as one; I can't even read its value from within a function in the same source file.
The only way I was able to pass a value to other files was through a global declaration in a .h file. But with function parameters I am stuck. How can I pass them to other functions within the same source file, without even bothering with main?
Can someone help me understand the basic structure of programs as used by IDE projects for µCs?
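
For what it's worth, the conventional C pattern the question is circling around looks like the sketch below (file-less, single translation unit; the symbol names are made up for illustration): a prototype lets you call a function before its definition, and a file-scope variable is visible to every function defined after it in the same .c file. Sharing it across .c files takes an `extern` declaration in a header plus exactly one definition.

```c
/* Illustrative sketch -- symbol names are hypothetical. */

/* Prototype: lets code earlier in the file call a function defined later. */
static int sensor_scale(int raw, int gain);

/* File-scope ("global") variable: visible to every function below it in
 * this translation unit. To share it across .c files, put
 *     extern int sensor_value;
 * in a header that the other files include, and keep exactly one
 * definition like this one. */
int sensor_value = 0;

/* Parameters are passed the same way as in any C program. */
void sensor_update(int raw, int gain)
{
    sensor_value = sensor_scale(raw, gain);
}

static int sensor_scale(int raw, int gain)
{
    return raw * gain;
}
```

If a file-scope variable seems invisible even within its own file, check that it isn't shadowed by a local of the same name, or hidden behind one of those preprocessor `#if` blocks.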

Appreciate any help,

          Design of a logging system : which filesystem to use ?        
Hello everyone,

I am designing an embedded system whose goal is to log sensor state continuously, and which can be powered off at any time.

For specific reasons, I currently have to use Windows XP and a CompactFlash storage medium. The first prototypes use this configuration:
* 1st partition: read-only, for the system (NTFS),
* 2nd partition: for data logging (NTFS).
The first tests are not so bad, but I am not fully satisfied with this solution (from time to time the filesystem or the CompactFlash card dies).

Do you have any advice on ways to improve the solution I am validating? Which filesystem should I use on a flash storage medium?
Which solution would you use (HW is x86) if Windows XP and CompactFlash were not constraints (which storage medium, which filesystem, which OS configuration, etc.)?

Thank you for your advice

          Infiniti EV Retro Prototype To Debut At The Pebble Beach        
Infiniti has announced that an EV retro prototype will debut at the 2017 Pebble Beach Concours d’Elegance in California, held August 15th to 20th. We really haven’t heard much of any electrification plans from Nissan’s luxury brand since the Infiniti LE Concept in 2012. The earlier LE was basically a ‘luxury...
          iPhone 4 Confirmed to Have 512MB of RAM (Twice the iPad and 3GS)

Since the launch of the original iPhone, Apple has made efforts to hide some of the actual tech specs of the device from consumers. Apple has never advertised or even published the processor speed or amount of RAM found in the iPhone. Arguably, Apple is trying to shield customers from these technical distractions and instead trying to focus on overall functionality.

More technically inclined customers, however, still love to know what they have to work with. The original iPhone and iPhone 3G contained 128MB of RAM, while the iPhone 3GS was boosted to 256MB of RAM. Similarly, the iPad contains 256MB of RAM. This discrepancy offers a technical explanation why Apple is not supporting iOS 4 features such as multi-tasking on the original and 3G iPhone -- there simply isn't enough RAM.

We have since heard that the upcoming iPhone 4's RAM has again been upgraded. This will bring it to a total of 512MB of RAM, twice as much as the 3GS and iPad. This number actually contradicts tear down photos of a prototype iPhone 4 that was leaked to the internet. An analysis of that prototype device showed it only carried 256MB of RAM. The 512MB figure, however, does agree with a Digitimes report from May 17th that confirmed a 960x640 screen resolution, thinner display, and indeed 512MB of RAM. We had heard that Apple confirmed this 512MB figure during one of the WWDC sessions last week, and have now verified this. The session it was revealed in is now available (Session 147, Advanced Performance Optimization on iPhone OS, pt 2) for registered Apple developers.

The added RAM should help overall performance and multi-tasking on the new iPhone 4. It could also explain why the iMovie App that Apple introduced at WWDC will only run on the iPhone 4. Apple has said iMovie will only run on the iPhone 4 and not the 3GS or even the iPad. It seems conceivable that it actually makes use of the extra RAM offered by the iPhone 4.

          Giveaway and interview : Pretty Momma Beauty        
Hi everyone!

I have a wonderful surprise for you today, a beauty giveaway!
I met Ekatrina on the Etsy forum, and she was very kind to offer to send one set of her amazing Raspberry Black Bath Tea. I was very curious about her product, especially when she mentioned it is real tea mixed with magnesium salts. The tea bag holds the leaves together, so it makes for an easy clean-up after a relaxing, skin-rejuvenating and soothing bath.

RaspBerry Black Tea Bath Giveaway by Pretty Momma Beauty

But, as amazing as her products are, I wanted you to learn more about the impressive lady behind the small natural shop Pretty Momma Beauty.
Ladies and gentlemen, I present to you Ekatrina

Tell us about yourself and how your Etsy adventure started:
I am a cell biology PhD student in the process of graduating and beginning my career as a professor. I started my first Etsy shop about halfway through my graduate career as a birthday present to myself. It has been a nice outlet for my creative energies in the midst of a demanding school load.

What pushed you to start your own beauty product line?
This shop began when I wanted to make bath salts for myself. I grew up in a home invested in natural and healthy products, so it was a short jump to combine that expertise with my scientific skills. The very first product I made was the mustard milk bath salts, which I formulated to combat muscle pain and tension. Once I got started, I just couldn’t stop, and I eventually opened this shop in order to share my products with a wider audience.

From the start, my products have been about providing tools to create a naturally healthy, beautiful you. Because of the demands on my time, I have been terrible about daily beauty upkeep. Once I started making my own products, I began to research and concoct products to help restore and repair my appearance. I formulated products that I found worked well, and so had to share those too. Because I need good value from the time I spend on myself, all my products are carefully formulated with high concentrations of active ingredients to maximize their effectiveness.

What inspires your collections?
My collections begin with an idea. For instance, I began my mud mask line in response to my search for something to reduce the appearance of my pores. Once I have an idea for a product type, my next question is “What other problems could this medium solve?”. So, in the mud mask example, I began expanding the concept to products that help maintain skin firmness, deal with inflammation, and provide deep-cleaning and detoxification benefits to help provide clear skin.

What is your process for your products?
When I start out to create a product, I begin by reading as much as I can about the process that produces the problem I want to solve and about how modern and traditional products go about addressing the problem. From that reading, I create a list of potential ingredients annotated with the mechanism of action of each. Then I research each ingredient intensively for safety and effectiveness, as well as to get an idea of which products would work well together. I create a prototype recipe designed to incorporate a diverse and synergistic ingredient list, then I go through repeated rounds of testing (on myself) and refinement for function and usability. The final product is what ends up in my shop. 

Are your products sold in boutiques or craft shows?
My primary outlet is online, although I have taken my products to a craft fair. I enjoyed the face to face interaction and direct feedback of the craft fair, so I hope to be able to do more of them after graduation. My products are not in a brick and mortar yet, but I am on the look-out for locations that would be a good fit. I’ve also been refining my packaging to reflect the professionalism and quality of my product, which is important in brick-and-mortar sales.

Can you tell us more about your Bath Tea Collection?
My bath tea collection is extensive, and primarily off-menu at this point. I have a wide variety of herbal and black teas, as well as a few more unusual options. My Rooibos bath tea is especially refreshing, and my lavender mint bath tea is a favorite among returning customers. All of my bath teas are pre-packaged for easy clean-up and also contain magnesium salts to promote well-being. I will be working on bringing more of these teas into my regular offering once the craziness of my current transition period has passed. In the meantime, I am delighted to accommodate requests!

How long does it take you to develop a new series of products?
The time it takes for me to develop a new product is directly proportional to how busy I am! :) Some products require minimal rounds of refinement and are ready to meet customers quickly. Other products, such as one particularly luscious conditioning treatment I have developed, require significant research to translate into a shelf-stable, easy-to-use product. (It’s been over a year since I made my first batch of that particular product, and it still isn’t ready to go to market. My hair, however, has greatly benefited from the research process.)

What’s next for PrettyMommaBeauty in 2015?
I have an extensive list of products to roll out in the next year. I have a very nice deep-conditioning moisturizer that is ready to go to market. It just needs final packaging decisions and photographs. I also want to elaborate on the idea and create a lighter moisturizer for those who live in warmer climates or who have more naturally moist skin. I also have an herbal detangling spray that has passed all my usage tests and needs to make its way into my shop. I have a vegetable-based blush that will be ready once I standardize shades, and initial recipes for complementary lip products. I want to create liquid shampoos and bath gels from my bar soap recipes, which should be a fairly quick process once I obtain the necessary supplies. I also want to develop a magnesium-based deodorant, as I think that would work better than the salt crystal I currently use. What would you like to see in my shop?

If you weren’t making beauty products, what would you be making?
If I wasn’t running Etsy shops in my spare time, I’d be reading, embroidering, or tutoring. In addition to making bath and beauty products, I love doing graphic design and creating jewelry. I’m also an avid treasure-hunter at resale shops.

If you could live anywhere in the world...?
If I could live anywhere in the world… I don’t know. I like my current home, but would love to go someplace warmer in the winter. My favorite European country is Italy, but I want to visit China, and Spain, and Russia, and the Italian Alps, and see more of France… Home is where the heart is, but I love to learn about other cultures and meet new people too.

You can follow Ekatrina on Twitter or on Tumblr.
And, you can enter the giveaway at the top of the article! :)

          These Machines Will Be Autonomous Before the Automobile        
When we talk about autonomous driving systems, we immediately think of the automobile: Tesla's Autopilot, the Renault Next Two prototypes, Uber's experiments… Yet cars are among the hardest devices to automate, given their central role in our societies and the number of potential pitfalls. Before them, many machines will be capable […]
          Amon Carter Museum of American Art Presents "The Polaroid Project: At the Intersection of Art and Technology"         
Release date: 
March 20, 2017

FORT WORTH, Texas—The Amon Carter Museum of American Art presents The Polaroid Project: At the Intersection of Art and Technology, a sweeping overview of the Polaroid phenomenon featuring the work of more than 100 artist-photographers along with examples of the tools and artifacts that helped make Polaroid a household name. The exhibition, which makes its U.S. debut at the Amon Carter, highlights the wide-ranging and often surprising uses of Polaroid materials through more than 150 images, including works by such heralded artists as Ellen Carey (b. 1952), Chuck Close (b. 1940), Marie Cosindas (b. 1925), Barbara Crane (b. 1928), David Hockney (b. 1937), Robert Rauschenberg (1925–2008) and Andy Warhol (1928–1987). The exhibition is on view June 3 through September 3, 2017; admission is free.

“Polaroid was the epitome of instant imaging long before the digital age,” says Joy Jeehye Kim, Assistant Curator of Photographs. “This show reveals the energy of artists who embraced the technology as a novel medium of experimentation.”

Visitors to the exhibition will see how artists experimented in black-and-white and color and produced images ranging widely in size and shape from modest 3” x 4” portraits to large dream-like mosaics built from 20” x 24” prints. Pervading all is an atmosphere of constant experimentation and energetic play as artists interrogate and reimagine the very purpose of photography.

The Polaroid Project reveals the Polaroid Corporation’s technical and marketing roots, but it also showcases the company’s unusually strong and extensive commitment to art, starting with founder Edwin Land’s (1901–1991) close friendship with the great landscape photographer Ansel Adams (1902–1984), who often tested the Corporation’s new cameras and films in the mid-1950s and was an avid promoter of its achievements.

The Polaroid Corporation stood proudly at the forefront of photographic image-making in a world that had come to believe that easier and faster meant better. The company’s invention and production of finely designed, high-quality, yet easy-to-use cameras and films brought virtual immediacy to a medium that previously lived by the notion of “shoot and wait.” Land was a proud, ever-pushing genius who never hesitated to sound messianic in his pronouncements about Polaroid’s technical achievements. Like the romantic scientists of a hundred years earlier with whom he identified, Land believed his invention would not merely serve mankind in myriad ways, at work and in leisure, in the arts and the sciences, but even advance democracy.

During Polaroid’s prime, its cameras and films were purchased by millions of amateurs and countless professionals. While families recorded their anniversaries and graduation parties, filmmakers and fashion photographers made test shots, scientists recorded their observations, and police documented crime scenes, artists embraced the new medium as a grand new tool for image-making and experimentation. At the heart of it all was instantaneity—no longer did photographers have to send films to a lab, and wait for days or weeks, or even go into a darkroom for a laborious chemical process. With Polaroid’s instant range, photographer and subject could watch together as the image took form before their eyes. Polaroid not only transformed photography, it set the foundation for the expectation of immediate results we are so accustomed to today.

Supplementing the exhibition’s artworks are rare artifacts from the Polaroid Corporation archives that trace the development of the technology from Land’s early work with 3D photographs in military service during World War II and his initial development of instant cameras and film to the company’s famous SX-70 and Spectra cameras. In addition to presenting production models of both cameras and films, the exhibition also includes prototypes made of paper, plastic and wood that illuminate the creative puzzle-solving of the company scientists, engineers and technicians working through how to shift photography from fast to instant by way of elegant packaging. Together, the objects and photographs reveal how artists helped shape Polaroid even as Polaroid coaxed artists into exploring new ways of seeing and visually imagining the world.

“The exhibition drives home the company’s deep belief that art could and should be made anytime, anywhere, by anyone,” says Kim. “Polaroid, quite literally, taught the public not merely a new way of seeing, but a new way of relating to the world.”

Interactive cards that include information about the Polaroid Corporation’s cameras, film and history will be scattered throughout the exhibition for visitors of all ages to use. Large print labels will also be available for use in the galleries.

The Polaroid Project has been organized by the Foundation for the Exhibition of Photography, Minneapolis/New York/Paris/Lausanne, in collaboration with the MIT Museum, Cambridge, Mass., and the WestLicht Museum for Photography, Vienna. After the exhibition closes at the Amon Carter, it travels to the WestLicht Museum for Photography, where it is on view December 5, 2017 to March 4, 2018. In 2018 and 2019, the exhibition will also tour the C/O Berlin, Museum für Kunst und Gewerbe in Hamburg, McCord Museum in Montreal and MIT Museum.

In conjunction with the exhibition, the Amon Carter will sell a catalogue for $40 in the Museum Store.

Free Public Programs
Artist Talk by Ellen Carey
June 15, 6:30 p.m.

Ellen Carey will discuss her experimental work with Polaroid from the 1970s to the present.

This program on American art, culture, and society is made possible by a generous gift from the late Anne Burnett Tandy.

Art Discovery: Oh Snap! Family Workshop
July 29, 10:30 a.m.

Families with children ages 7 to 12 can make a photo-inspired art project. Reserve your spot beginning June 1.


          Acura teases new ARX-05 Prototype Race Car ahead of Monterey Automotive Week (+video)        
Acura teased its newest race car, the Acura ARX-05 prototype, to be fielded by the legendary Team Penske in 2018. Introduced in a new video, the ARX-05 marks the Acura brand’s anticipated return to prototype racing and will be fully revealed at The Quail, A Motorsports Gathering, part of the prestigious Monterey Automotive Week, on August 18. …


          Does glassware really make a difference?        

So often when we go out to enjoy a few adult beverages, many bars and restaurants just pour them into the same thing: a shaker pint.  Named after its purpose, a shaker pint is what bartenders cap the shaker with to mix up a cocktail.  Most bars also use them to serve beer for one simple reason: they're easy to stack, so they are cheaper from a space perspective.  Maybe if you're lucky, your bar has nonic pints.  Those are the ones with the bulge about 1/4 of the way down from the top.  But the question is, does it matter?  Recently, I had the opportunity to attend an event to find out.  To celebrate their recently installed 200 BBL brewhouse, Abita Brewery invited some people in the homebrewing and beer blogging community to a first peek at the new setup and also to a glassware pairing conducted by Spiegelau.  In exchange for a nominal fee, attendees received a five-piece tasting set and an Abita shaker pint (and the beer to compare).  First we toured the brewhouse, but that's not our focus here.  Afterwards, we all found a seat in the tap room for the real show for non-brewers.  The tasting was conducted by Chris Hillin, Regional Sales Manager for Riedel/Spiegelau, and Jaime Jurado, Director of Brewing Operations for Abita.  Now, on to the beer...

First up was Abita Amber:

This beer is not an American Amber.  It's probably closest to a German Vienna lager.  Only the color is amber.  For this one, we were asked to pour half in the shaker pint and half in the lager glass.  I was surprised by the difference.  Normally Amber is on my list of last-resort beers, like Sam Adams Boston Lager.  In other words, if my draft options are BudMillerCoors or Amber, that's the only time I'm getting an Amber.  The shaker pint was the reason why.  It was dull, uninteresting, and lifeless.  However, in the lager glass it actually showcased some of the complex maltiness of a Vienna.  The reasons are twofold.  Primarily, the shape of the glassware is designed around highlighting the strengths of certain beers.  The lager glass has a relatively large bell tapered to collect and focus the complex maltiness (think caramel and toasted bread) of many European lagers.  Second, the type of glass they use allows them to make the glasses thinner without compromising structural integrity.  They then showed microscopic cross sections of most glassware and the type they use.  Ever wonder why eventually your glasses get cloudy after many cycles through the dishwasher?  Apparently there are tons of microscopic pits and valleys in most glasses.  Another apparent benefit to the type of glass that Spiegelau uses is that it did not display these same tiny 'flaws'.

Our next selection was S.O.S. (Save Our Shore):

This is an unfiltered Weizen Pils (yeah, that style doesn't exist anywhere).  It was brewed to raise funds for the Louisiana coastal protection efforts during the BP oil spill a few years ago.  For this one, we actually poured part into the lager glass and part into the wheat glass (which is similar in shape to a traditional pilsner glass).  Yet again I was surprised by how much difference the glasses made.  The lager glass was OK, but the wheat glass really made the balance of hops to wheat and malt pop.  The process used to pair the beers and glasses is somewhat unscientific.  They basically get their tasters (or a combination of theirs and a brewery's if they're partnering) to try a beer in several different prototypes and then revamp until they find one that really makes a certain style shine.

And then came Spiegelau's claim to fame, the IPA glass and Wrought Iron IPA:

Abita has attempted to make it into the IPA market quite a few times.  So far, the results have been mixed for me.  Now, there's their new Wrought Iron IPA.  I had it on draft in a shaker pint a few days before the event, and although the nose on it was pretty good (Mosaic hops had lots of berry notes), the flavor was harsh grapefruit and disappointing.  In the IPA glass, though, it was actually pretty good.  The citrus, pine, and berry (with a touch of muskiness) were far more balanced.  The taste was also much less harsh.  In the shaker, it was exactly how I remembered it.  Designed with Dogfish Head and Sierra Nevada, the IPA glass has a few things going on.  The large bell with a relatively small tapered top focuses the hop aromas and keeps them around.  The wavy bottom actually serves to enhance the experience by creating nucleation points to release more carbonation when you get towards the halfway point in the glass.  When you tilt it to sip, you create more foam and release more aromas when you put it back down.  I'm getting sold on the concept by this point.

Next up was the stemmed tulip and Abbey Ale:

This is Abita's take on a Belgian dubbel.  This is actually where they sold me on the glassware concept.  I've had this before and it was pretty decent, but in the stemmed tulip, it was downright sublime.  My wife even commented that we needed to pick some up next time we had a chance.  The sweet breadiness of the malt and the banana and light clove from the yeast were showcased by the shape of the glass.  This one is made for most big malt bombs.  Think Belgians, Scotch Ales, Imperial Stouts and Porters, etc.  More on this in a bit...

The final official pairing was the relatively new stout glass and Naughty Quaker:

Naughty Quaker is an oatmeal stout.  It's part of Abita's Select Series which means it's typically draft only and rarely makes it out of Louisiana.  This is unfortunate, because most of Abita's really great beers have fallen into this series while the rest of the world only gets Amber, Purple Haze, and the like.  The stout glass is similar in some ways to the IPA glass except squatter and without the wavy nucleation points.  This tends to focus the roasty character of dark malt the most.  At this point, we weren't comparing anything to the shaker pint, but I had a glass in one when we stopped for supper on the way home and the glass truly does make a difference.  I generally don't like shaker pints anyway, but now I'm downright spoiled against them.

This was the end of the official tasting, but I had a pop quiz of sorts prepared.  I went up to Chris and asked him what glass he thought might pair best with a wood-aged Scotch ale.  He said it was probably a tossup between the stemmed tulip and the stout glass.  I told him this was the opportune time to test it since I had brought along a mini-growler of mine.  This beer is a strong Scotch ale which Jaime termed a "wee heavy" after tasting it.  I took my standard Scotch ale recipe and aged it on a combination of light and dark toasted oak chips soaked in Macallan 15.  The clear winner was the stemmed tulip.  It really allowed all aspects of the beer to shine.  The dark fruit (raisins, plums, figs, and black cherries) combined with the vanilla and a hint of leather and tobacco from the wood were pretty amazing, if I do say so myself.  The stout glass really only showcased the wood: all barrel and no fruit.  In conclusion, glassware does make a difference in my opinion.  I sort of wish they made glassware suited for beer judging now.  Most competitions are based on the aromas and flavors you get from "airline cups", those ubiquitous cups that you seem to only see at homebrew competitions and on an airplane.  Scaled-down tasters more suited to the styles would be a worthy investment for me.

          Is Trump really like Andrew Jackson?        
Andrew Jackson is getting a lot of attention lately, none of it favorable.  Meanwhile, President Trump, while rather vague on certain details of American history, has expressed admiration for him.  And many commentators have argued that they are, in fact, similar in important ways.  All this is hard for me to assimilate, because when I was growing up, Andrew Jackson was something of a liberal hero, if not quite of the stature of Jefferson or Lincoln or FDR.  He believed in more direct democracy, he hated financial privilege, he was supported by a coalition of workers and farmers, and Arthur Schlesinger Jr. had specifically painted him as a kind of prototype for FDR.  Now, of course, we are paying more attention to Jackson's status as a slave owner, and his involvement in the removal of Indian tribes to the west of the Mississippi.  I decided to spend a few minutes to try to rediscover who Jackson actually was--with particular reference to the question of whether he in fact had anything in common with Donald Trump.

Neither time nor space permits an exhaustive examination of this question, but it didn't take long to find some interesting excerpts in his lengthy, careful annual messages to Congress.  This one comes from his first, in December 1829--and calls for direct popular election of the President! Here are Jackson's words.

"To the people belongs the right of electing their Chief Magistrate; it was never designed that their choice should in any case be defeated, either by the intervention of electoral colleges or by the agency confided, under certain contingencies, to the House of Representatives. Experience proves that in proportion as agents to execute the will of the people are multiplied there is danger of their wishes being frustrated. Some may be unfaithful; all are liable to err. So far, therefore, as the people can with convenience speak, it is safer for them to express their own will.

"The number of aspirants to the Presidency and the diversity of the interests which may influence their claims leave little reason to expect a choice in the first instance, and in that event the election must devolve on the House of Representatives, where it is obvious the will of the people may not be always ascertained, or, if ascertained, may not be regarded. From the mode of voting by States the choice is to be made by 24 votes, and it may often occur that one of these will be controlled by an individual Representative. Honors and offices are at the disposal of the successful candidate. Repeated ballotings may make it apparent that a single individual holds the cast in his hand. May he not be tempted to name his reward? . . .

"I would therefore recommend such an amendment of the Constitution as may remove all intermediate agency in the election of the President and Vice-President. The mode may be so regulated as to preserve to each State its present relative weight in the election, and a failure in the first attempt may be provided for by confining the second to a choice between the two highest candidates. In connection with such an amendment it would seem advisable to limit the service of the Chief Magistrate to a single term of either 4 or 6 years. If, however, it should not be adopted, it is worthy of consideration whether a provision disqualifying for office the Representatives in Congress on whom such an election may have devolved would not be proper."

The abolition of the electoral college has become a favorite liberal demand, all the more so because Jackson's proposal, had it been embodied in the Constitution, would have kept both George W. Bush and Donald Trump out of the White House.  I don't have time to find out exactly how and why Jackson's proposal failed of adoption, but it appears to mark him as a genuine champion of the people's rule, albeit, of course, within the framework of his time, in which women were not allowed to vote and slavery still existed in 15 states.  There is, however, another aspect to this proposal, which casts it in a different light.

Jackson was in effect complaining that he was only in his first year in the White House instead of his fifth.  The party system had broken down in 1824 and he had run for President against three other candidates from the Democratic Party: William Crawford, John Quincy Adams, and Henry Clay.  Jackson had won the popular vote handily, but he had not won a majority in the electoral college and the election had gone to the House of Representatives.  There he had been bested by Adams, to whom Clay had thrown his support.  Then Adams made the great political blunder of his career by naming Clay Secretary of State, and cries of "corrupt bargain!" rang through the land.  Rather than tweeting that he had been the real winner, Jackson was more discreetly referring to these events in his address.  He may have been a sincere Democrat--but he could also hold a grudge.  Many years later, in retirement, he reportedly said that he had only two regrets--that he had never been able to shoot Henry Clay, or to hang John C. Calhoun.

A year later, in December 1830, Jackson commented on the quick, nearly bloodless revolution that had replaced the conservative Bourbon monarchy in France with the more liberal and constitutional rule of Louis Philippe.  He put this development in the context of world history, in which the United States was now playing a key role:

"The important modifications of their Government, effected with so much courage and wisdom by the people of France, afford a happy presage of their future course, and have naturally elicited from the kindred feelings of this nation that spontaneous and universal burst of applause in which you have participated. In congratulating you, my fellow citizens, upon an event so auspicious to the dearest interests of mankind I do no more than respond to the voice of my country, without transcending in the slightest degree that salutary maxim of the illustrious Washington which enjoins an abstinence from all interference with the internal affairs of other nations. From a people exercising in the most unlimited degree the right of self-government, and enjoying, as derived from this proud characteristic, under the favor of Heaven, much of the happiness with which they are blessed; a people who can point in triumph to their free institutions and challenge comparison with the fruits they bear, as well as with the moderation, intelligence, and energy with which they are administered -- from such a people the deepest sympathy was to be expected in a struggle for the sacred principles of liberty, conducted in a spirit every way worthy of the cause, and crowned by a heroic moderation which has disarmed revolution of its terrors. Notwithstanding the strong assurances which the man whom we so sincerely love and justly admire [I do not know to whom this referred] has given to the world of the high character of the present King of the French, and which if sustained to the end will secure to him the proud appellation of Patriot King, it is not in his success, but in that of the great principle which has borne him to the throne -- the paramount authority of the public will -- that the American people rejoice."

On the eve of his death only four years earlier, Jefferson had reiterated the hope that liberty, as expressed in the Declaration of Independence, would come to the whole world.  Jackson's remarks, praising the French step down this path, were in this tradition.  Two years later Britain also took a small step towards popular rule, when the Reform Act of 1832 became law.  Today our President is also praising a worldwide political trend--but this time the trend is towards authoritarianism, not towards democracy.  The President's long-standing admiration for Vladimir Putin is well known, but in recent weeks he has congratulated the Turkish President Erdogan on a vote that gave him even more power, invited the murderous President Duterte of the Philippines to Washington, and offered to meet with Kim Jong Un.  His Administration shows signs of becoming the first American administration specifically to endorse a trend towards authoritarianism--the opposite of what Jackson and other 19th century Presidents did.

In the same message Jackson mentioned that the government had had to put down a rebellion, or independence movement, among the Choctaw and Chickasaw tribes in Alabama and Mississippi, and endorsed their removal to Indian territory in what is now Oklahoma.  But he made no attempt to conceal the hardship involved in these measures, while trying to put them in historical context.

"Humanity has often wept over the fate of the aborigines of this country, and Philanthropy has been long busily employed in devising means to avert it, but its progress has never for a moment been arrested, and one by one have many powerful tribes disappeared from the earth. To follow to the tomb the last of his race and to tread on the graves of extinct nations excite melancholy reflections. But true philanthropy reconciles the mind to these vicissitudes as it does to the extinction of one generation to make room for another. In the monuments and fortifications of an unknown people, spread over the extensive regions of the West, we behold the memorials of a once powerful race, which was exterminated or has disappeared to make room for the existing savage tribes. [He appears to be referring here to the Mound Builders.] Nor is there any thing in this which, upon a comprehensive view of the general interests of the human race, is to be regretted. Philanthropy could not wish to see this continent restored to the condition in which it was found by our forefathers. What good man would prefer a country covered with forests and ranged by a few thousand savages to our extensive Republic, studded with cities, towns, and prosperous farms, embellished with all the improvements which art can devise or industry execute, occupied by more than 12,000,000 happy people, and filled with all the blessings of liberty, civilization, and religion?

"The present policy of the Government is but a continuation of the same progressive change by a milder process. The tribes which occupied the countries now constituting the Eastern States were annihilated or have melted away to make room for the whites. The waves of population and civilization are rolling to the westward, and we now propose to acquire the countries occupied by the red men of the South and West by a fair exchange, and, at the expense of the United States, to send them to a land where their existence may be prolonged and perhaps made perpetual.
Doubtless it will be painful to leave the graves of their fathers; but what do they more than our ancestors did or than our children are now doing? To better their condition in an unknown land our forefathers left all that was dear in earthly objects. Our children by thousands yearly leave the land of their birth to seek new homes in distant regions. Does Humanity weep at these painful separations from every thing, animate and inanimate, with which the young heart has become entwined? Far from it. It is rather a source of joy that our country affords scope where our young population may range unconstrained in body or in mind, developing the power and faculties of man in their highest perfection."

Today, our universities have for decades been preoccupied with the faults of western civilization and the injuries it has inflicted upon other regions of the world, with the implication that history's course should certainly be held in place, if not reversed.  And a great many Americans have come to regard their nation's founding and growth as a crime.  I would suggest that it was almost impossible for an American of Jackson's age (born in 1767) to hold that view. They had experienced the Declaration of Independence, the Constitution, the Louisiana Purchase, and the formation of many new states.  They saw all this as a great human experiment in which they were the leading actors. And when Jackson pointed out that Indian civilizations had warred against one another even to the point of extinction before the arrival of the Europeans, he was only speaking the truth. I shall let my readers make their own judgments about Jackson's words and actions, and how they fit into the whole history of the United States.  But I do think today's US citizens might ask themselves if they truly repudiate what our ancestors did in creating the United States as it now is--keeping in mind that so many of us, white, black, brown and yellow, would never have existed had they not done so, since our ancestors would have been so unlikely to have met elsewhere.

I turn now to Jackson's most famous state paper, his veto of the renewal of the charter of the Bank of the United States in July 1832.  The Bank enjoyed special privileges under the law that created it which turned it into the equivalent of a European central bank, and Jackson complained that it had used those privileges to accumulate enormous power over the banking system, and enormous wealth at the expense of ordinary Americans.  He continued:

"It is to be regretted that the rich and powerful too often bend the acts of government to their selfish purposes. Distinctions in society will always exist under every just government. Equality of talents, of education, or of wealth can not be produced by human institutions. In the full enjoyment of the gifts of Heaven and the fruits of superior industry, economy, and virtue, every man is equally entitled to protection by law; but when the laws undertake to add to these natural and just advantages artificial distinctions, to grant titles, gratuities, and exclusive privileges, to make the rich richer and the potent more powerful, the humble members of society--the farmers, mechanics, and laborers--who have neither the time nor the means of securing like favors to themselves, have a right to complain of the injustice of their Government. There are no necessary evils in government. Its evils exist only in its abuses. If it would confine itself to equal protection, and, as Heaven does its rains, shower its favors alike on the high and the low, the rich and the poor, it would be an unqualified blessing. In the act before me there seems to be a wide and unnecessary departure from these just principles."

It was this message, more than anything else, that established Jackson as the heir to the tradition of both political and economic democracy that was begun by Jefferson and elaborated upon by Wilson,  Franklin Roosevelt, and Lyndon Johnson in the twentieth century.  Today that tradition survives in Bernie Sanders and Elizabeth Warren--but they represent only one wing of the Democratic Party.  Donald Trump, needless to say, is completely outside that tradition and he and the Republicans in Congress want to destroy it.

It will not have escaped the reader's attention, meanwhile, that Andrew Jackson possessed a command of the English language of which Donald Trump never dreamed, and that he took his duties as President of the world's leading republic with a seriousness of which Trump would never be capable.  It has become fashionable to judge historical figures according to simple, binary moral standards, in which acts that even recognize, much less further, racism or sexism automatically mark men as evil.  I have attempted to suggest that Andrew Jackson is one of many figures from our history to whom these rules do less than justice.  And I have attempted to show clearly that any similarities between Trump and Andrew Jackson are far outweighed by enormous differences of political outlook and goals.

          Chip Foose Named Grand Marshal for Race at Laguna Seca        
Legendary designer Chip Foose was named Grand Marshal for the May 17 Grand-Am Rolex Series race at Mazda Raceway Laguna Seca. Award-winning automotive designer Chip Foose has been named Grand Marshal of the May 17 Verizon Festival of Speed presented by SPEEDCOM at Mazda Raceway Laguna Seca, Round 4 of the 2009 Grand-Am Rolex Sports Car Series presented by Crown Royal Cask No. 16 schedule. In addition to giving the command to start the engines for the Daytona Prototype and GT class driv...
          Envisioning The Future Of Sports Gaming        
Video prototypes what the future of sports gaming will look like for console enthusiasts.
          A bit of History        
For those wondering, here's the history of JPype:

I always have a lot of projects going on. And in many cases, while I would prefer to use Python to implement them, requirements and/or convenience often steers me toward Java. Let's face it, when it comes to community mindshare, Python is no slouch, but Java definitely is the 500 lbs. gorilla.

But I really wanted to use Python, so I looked around to see how easy it was to mix the two. Jython (JPython at the time) was not an option because of general slowness and lack of feature support. I failed to successfully build the only Python/Java integration library I found. So I decided to build my own. That was back in May of 2004.

The initial versions (0.1 to 0.4) were more or less prototype quality. The C++ code was extensive, with lots of Python extension types and lots of problems making Java classes behave like Python classes. Java-specific code and Python-specific code were hopelessly locked together.

0.5 was a complete rewrite, with an eye towards separating the bridging code. Although the amount of C++ code didn't shrink, this saw the introduction of real, dynamically created Python classes. No more trying to make extension types behave like regular Python classes. This was almost perfect.

Major limitations include the inability to raise/except with straight java exception classes (needs to use the member PYEXC instead), and the inability to cleanly shutdown/restart a JVM.

JPype got its first real test when Chas Emerick of Snowtide Informatics contacted me about polishing JPype for use in one of their products. I can honestly say the partnership has greatly benefited JPype, with all the improvements made then folded back into the code.

The release of 0.5 was followed by a lengthy pause in development, lack of time and interest in other projects being the major reasons. Now the time has come to resume work towards that almost mythical 1.0 release. 0.6 will be out sometime in the coming months. The details of this, however, will have to be the subject of another post ...

Read back for more info later on.
          Blog Post: Near Perfection        

I honestly thought this game was amazing. I have had it since day one! People complain that in the single player you die too quickly as a Big Daddy, and honestly, you kinda are supposed to. Since you are the "Prototype Big Daddy," your armor is a lot thinner, especially since it was made before the civil war broke out; there was no need for thick armor then. Now for the multiplayer. I honestly thought it was a decent experience, but everything going for it got taken away by lag. Further down the road, however, 2K made an update to fix the game, and it worked for the most part. The DLC for this game is great as well. Rapture Metro is an amazing map pack if there ever was one: you get six maps, a new game mode, and 150 new gamerscore for 800 Microsoft Points ($10), compared to MW2's five maps for 1200 Microsoft Points ($15). Overall, this is one of the best games I have ever played, and if people took more time to understand it, more would feel the same.

          You'll Never Believe the Big Hairy Audacious Startup John Jacob Astor Created in 1808        


Think your startup has a Big Hairy Audacious Goal? Along with President Thomas Jefferson, John Jacob Astor conceived (in 1808), and implemented (in 1810), a plan to funnel the entire tradable wealth of the westernmost sector of the North American continent north of Mexico through his own hands. Early accounts described it as “the largest commercial enterprise the world has ever known.”

Think your startup raised a lot of money? Astor put up $400,000 ($7,614,486 in today's dollars) of his own money, with more committed after the first prototype succeeded.

Think competition is new? John Jacob Astor dealt with rivals in one of three ways: he tried to buy them out; if that didn’t work, he tried to partner with them; if he failed to join them, he tried to crush them.

Think your startup requires commitment? Joining Astor required pledging five years of one’s life to a start-up venture bound for the unknown.

Think your startup works hard? Voyageurs paddled twelve to fifteen hours per day, with short breaks while afloat for a pipe of tobacco. During that single day each voyageur would make more than thirty thousand paddle strokes. On the upper Great Lakes, the canoes traversed hundreds of miles of empty, forested shorelines and vast stretches of clear water without ports or settlements or sails, except for the scattered Indian encampments.

Think your product is complex? Astor planned, manned and outfitted one overseas and two overland expeditions to build the equivalent of a Jamestown settlement on the Pacific Coast.

Think your startup parties hard? Every nook and corner in the whole island swarmed, at all hours of the day and night, with motley groups of uproarious tipplers and whisky-hunters. It resembled a great bedlam, the frantic inmates running to and fro in wild forgetfulness. Many were eager for company and with a yen to cut loose—drinking, dancing, singing, whoring, fighting, buying knickknacks and finery from the beach’s shacks and stalls. 

Think your startup was an adventure you can never forget? I have been twenty-four years a canoe man, and forty-one years in service; no portage was ever too long for me. Fifty songs could I sing. I have saved the lives of ten voyageurs. Have had twelve wives and six running dogs. I spent all my money in pleasure. Were I young again, I should spend my life the same way over. There is no life so happy as a voyageur’s life!

Think people at your startup dress weird? Above the waist, the voyageurs wore a loose-fitting and colorful plaid shirt, perhaps a blue or red, and over it, depending on the weather, a long, hooded, capelike coat called a capote. In cold winds they cinched this closed with a waist sash—the gaudier the better, often red. From the striking sash dangled a beaded pouch that contained their fire-making materials and tobacco for their “inevitable pipe.”...The true “Man of the North” wore a brightly colored feather in his cap to distinguish himself from the rabble.

Think your startup takes risks? Half of them died.

And like most startups, they accomplished a lot, but ultimately failed to earn a payout.

Thomas Jefferson said to John Jacob Astor: Your name will be handed down with that of Columbus & Raleigh, as the father of the establishment and the founder of such an empire. Unfortunately, not so much, Tom. How many have heard of Astor today? Not many, unless you've traveled to Astoria, Oregon. Astoria in the right weather is a gorgeous place with a hot beer scene.

It's trite to say the reward is in the journey, but in this case the saying is true, the journey was larger than digital life.

For the complete story read: Astoria: John Jacob Astor and Thomas Jefferson's Lost Pacific Empire: A Story of Wealth, Ambition, and Survival.

          Stuff The Internet Says On Scalability For July 14th, 2017        

Hey, it's HighScalability time:



We've seen algorithms expressed in seeds. Here's an algorithm for taking birth control pills expressed as packaging. Awesome history on 99% Invisible.

If you like this sort of Stuff then please support me on Patreon.


  • 2 trillion: web requests served daily by Akamai; 9 billion: farthest star ever seen in light-years; 10^31: bacteriophages on earth; 7: peers needed to repair ransomware damage; $30,000: threshold of when to leave AWS; $300K-$400K: beginning cost of running Azure Stack on HPE ProLiant; 3.5M: files in Microsoft's git repository; 300M: Google's internal image data training set size; 7.2 Mbps: global average connection speed; 85 million: Amazon Prime members; 35%: share of Germany's electricity generated from renewables;

  • Quotable Quotes:
    • Jessica Flack: I believe that science sits at the intersection of these three things — the data, the discussions and the math. It is that triangulation — that’s what science is. And true understanding, if there is such a thing, comes only when we can do the translation between these three ways of representing the world.
    • gonchs: “If your whole business relies on us [Medium], you might want to pick a different one”
    • @AaronBBrown777: Hey @kelseyhightower, if you're surfing GitHub today, you might find it interesting that all your web bits come thru Kubernetes as of today.
    • Psyblog: The researchers were surprised to find that a more rebellious childhood nature was associated with a higher adult income.
    • Antoine de Saint-Exupéry: If you want to build a ship, don't drum up people to collect wood and don't assign them tasks and work, but rather teach them to long for the endless immensity of the sea.
    • Marek Kirejczyk: In general I would say: if you need to debug — you’ve already lost your way.
    • jasondc: To put it another way, RethinkDB did extremely well on Hacker News. Twitter didn't, if you remember all the negative posts (and still went public). There is little relation between success on Hacker News and company success.
    • Rory Sutherland: What intrigues me about human decision making is that there seems to be a path-dependence involved - to which we are completely blind.
    • joeblau: That experience taught me that you really need to understand what you're trying to solve before picking a database. Mongo is great for some things and terrible for others. Knowing what I know now, I would have probably chosen Kafka.
    • 0xbear: cloud "cores" are actually hyperthreads. Cloud GPUs are single dies on multi-die card. If you use GPUs 24x7, just buy a few 1080 Ti cards and forego the cloud entirely. If you must use TF in cloud with CPU, compile it yourself with AVX2 and FMA support. Stock TF is compiled for the lowest common denominator
    • Dissolving the Fermi Paradox: Doing a distribution model shows that even existing literature allows for a substantial probability of very little life, and a more cautious prior gives a significant probability for rare life
    • Peter Stark: Crews with clique structures report significantly more depression, anxiety, anger, fatigue and confusion than crews with core-periphery structures.
    • Patrick Marshall: Gu said that the team expects to have a prototype [S2OS’s software-defined hypervisor is being designed to centrally manage networking, storage and computing resources] ready in about three years that will be available as open-source software.
    • cobookman: I've been amazed that more people don't make use of googles preemtibles. Not only are they great for background batch compute. You can also use them for cutting your stateless webserver compute costs down. I've seen some people use k8s with a cluster of preemtibles and non preemtibles. 
    • @jeffsussna: Complex systems can’t be fully modeled. Failure becomes the only way to fully discover requirements. Thus the need to embrace it.
    • Jennifer Doudna: a genome’s size is not an accurate predictor of an organism’s complexity; the human genome is roughly the same length as a mouse or frog genome, about ten times smaller than the salamander genome, and more than one hundred times smaller than some plant genomes.
    • Daniel C. Dennett: In Darwin’s Dangerous Idea (1995), I argued that natural selection is an algorithmic process, a collection of sorting algorithms that are themselves composed of generate-and-test algorithms that exploit randomness (pseudo-randomness, chaos) in the generation phase, and some sort of mindless quality-control testing phase, with the winners advancing in the tournament by having more offspring.
    • Almir Mustafic: My team learned the DynamoDB limitations before we went to production and we spent time calculating things to properly provision RCUs and WCUs. We are running fine in production now and I hear that there will be automatic DynamoDB scaling soon. In the meantime, we have a custom Python script that scales our DynamoDB.

  • I've written a novella: The Strange Trial of Ciri: The First Sentient AI. It explores the idea of how a sentient AI might arise as ripped-from-the-headlines deep learning techniques are applied to large social networks. I try to be realistic with the technology. There's some hand waving, but I stay true to the programmer's perspective on things. One of the big philosophical questions is how do you even know when an AI is sentient? What does sentience mean? So there's a trial to settle the matter. Maybe. The big question: would an AI accept the verdict of a human trial? Or would it fight for its life? When an AI becomes sentient, what would it want to do with its life? Those are the tensions in the story. I consider it hard scifi, but if you like LitRPG there's a dash of that thrown in as well. Anyway, I like the story. If you do too please consider giving it a review on Amazon. Thanks for your support!

  • Serving 39 Million Requests for $370/Month, or: How We Reduced Our Hosting Costs by Two Orders of Magnitude. Step 1: Just Go Serverless: Simply moving to a serverless environment had the single greatest impact on reducing hosting costs. Our extremely expensive operating costs immediately shrunk by two orders of magnitude. Step 2: Lower Your Memory Allocation: Remember, each time you halve your function’s memory allocation, you’re roughly halving your Lambda costs. Step 3: Cache Your API Gateway Responses: We pay around $14 a month for a 0.5GB API Gateway cache with a 1 hour TTL. In the last month, 52% (20.3MM out of 39MM) of our API requests were served from the cache, meaning less than half (18.7MM requests) required invoking our Lambda function. That $14 saves us around $240 a month in Lambda costs.
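The arithmetic behind Step 2 is easy to sanity-check: Lambda bills compute in GB-seconds (memory allocated times execution time), so halving the memory allocation roughly halves the bill, provided duration stays flat. A minimal sketch; the per-GB-second price is an illustrative assumption rather than a quote of current AWS pricing, and the request and duration figures are hypothetical:

```python
# Back-of-the-envelope Lambda compute pricing. Halving memory_mb halves the
# result, because cost scales linearly with GB-seconds.
PRICE_PER_GB_SECOND = 0.0000166667  # illustrative assumption, not a quoted rate

def monthly_lambda_cost(requests, avg_duration_s, memory_mb):
    """Approximate monthly Lambda compute cost, ignoring the free tier
    and the flat per-request charge."""
    gb_seconds = requests * avg_duration_s * (memory_mb / 1024)
    return gb_seconds * PRICE_PER_GB_SECOND

full = monthly_lambda_cost(39_000_000, 0.120, 1024)  # 1 GB allocation
half = monthly_lambda_cost(39_000_000, 0.120, 512)   # same load, half the memory
```

With duration held constant, `half` comes out to exactly half of `full`; in practice a smaller allocation can also slow the function down, which eats into the savings.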

Don't miss all that the Internet has to say on Scalability, click below and become eventually consistent with all scalability knowledge (which means this post has many more items to read so please keep on reading)...

          Migrating a live server to another host with no downtime        

I have had a 1U server co-located for some time now at iWeb Technologies' datacenter in Montreal. So far I've had no issues and it did a wonderful job hosting websites & a few other VMs, but because of my concern for its aging hardware I wanted to migrate away before disaster struck.

Modern VPS offerings are a steal in terms of the performance they offer for the price, and Linode's 4096 plan caught my eye at a nice sweet spot. Backed by powerful CPUs and SSD storage, their VPS is blazingly fast; the only downside is I would lose some RAM and HDD-backed storage compared to my 1U server. The bandwidth provided with the Linode was also a nice bump up from my previous 10Mbps, 500GB/mo traffic limit.

When CentOS 7 was released I took the opportunity to immediately start modernizing my CentOS 5 configuration and testing it on the new release. I wanted to ensure full continuity for client-facing services - other than a nice speed boost, I wanted clients to take no manual action on their end to reconfigure their devices or domains.

I also wanted to ensure zero downtime. As the DNS A records are being migrated, I didn't want emails coming in to the wrong server (or clients checking stale inboxes until they started seeing the new mailserver IP). I can easily configure Postfix to relay all incoming mail on the CentOS 5 server to the IP of the CentOS 7 one to avoid any loss of emails, but there's still the issue that some end users might connect to the old server and get served their old IMAP inbox for some time.

So first things first, after developing a prototype VM that offered the same service set I went about buying a small Linode for a month to test the configuration with some of my existing user data from my CentOS 5 server. MySQL was sufficiently easy to migrate over and Dovecot was able to preserve all UUIDs, so my inbox continued to sync seamlessly. Apache complained a bit when importing my virtual host configurations due to the new 2.4 syntax, but nothing a few sed commands couldn't fix. So with full continuity out of the way, I had to develop a strategy to handle zero downtime.

With some foresight and DNS TTL adjustments, we can get near zero downtime assuming all resolvers comply with your TTL. Simply set your TTL to 300 (5 minutes) a day or so before the migration occurs and as your old TTL expires, resolvers will see the new TTL and will not cache the IP for as long. Even with a short TTL, that's still up to 5 minutes of downtime and clients often do bad things... The IP might still be cached (e.g. at the ISP, router, OS, or browser) for longer. Ultimately, I'm the one that ends up looking bad in that scenario even though I have done what I can on the server side and have no ability to fix the broken clients.
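The TTL wind-down above can be sketched as a small timing calculation: after you lower the record's TTL, resolvers that cached the old answer may keep serving it for up to the old TTL, so the A-record flip is only trustworthy once that window has fully expired. A hypothetical timeline, assuming the old TTL was one day:

```python
# Sketch of the DNS TTL wind-down reasoning. Dates and TTLs are hypothetical.
from datetime import datetime, timedelta

def safe_cutover_time(ttl_lowered_at, old_ttl_seconds):
    # Earliest moment at which every compliant resolver has picked up the new,
    # shorter TTL (misbehaving resolvers and OS/browser caches can still lag).
    return ttl_lowered_at + timedelta(seconds=old_ttl_seconds)

lowered = datetime(2014, 8, 1, 12, 0)        # TTL dropped to 300 s at noon
cutover = safe_cutover_time(lowered, 86400)  # old TTL was 86400 s (1 day)
```

This is exactly why the article lowers the TTL "a day or so" ahead of the migration, and why some forwarding safety net is still needed for the clients that ignore TTLs entirely.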

To work around this, I discovered an incredibly handy tool, socat, that can make magic happen. socat routes data between sockets, network connections, files, pipes, you name it. Installing it is as easy as: yum install socat

A quick script later and we can forward all connections from the old host to the new host:


# Stop services on this host
for SERVICE in dovecot postfix httpd mysqld; do
  /sbin/service $SERVICE stop
done

# Some cleanup
rm /var/lib/mysql/mysql.sock

# Map the new server's MySQL to localhost:3307
# Assumes capability for password-less (e.g. pubkey) login;
# NEWIP holds the new server's IP address
ssh $NEWIP -L 3307:localhost:3306 &
socat unix-listen:/var/lib/mysql/mysql.sock,fork,reuseaddr,unlink-early,unlink-close,user=mysql,group=mysql,mode=777 TCP:localhost:3307 &

# Map ports from each service to the new host
for PORT in 110 995 143 993 25 465 587 80 3306; do
  echo "Starting socat on port $PORT..."
  socat TCP-LISTEN:$PORT,fork TCP:${NEWIP}:${PORT} &
  sleep 1
done
And just like that, every connection made to the old server is immediately forwarded to the new one. This includes the MySQL socket (which is automatically used instead of a TCP connection when a host of 'localhost' is passed to MySQL).
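To make concrete what each socat instance is doing, here is a minimal accept-and-pipe TCP forwarder sketched in Python. It illustrates the pattern only; the real socat invocations additionally handle the UNIX socket, ownership and permissions, and robust per-connection forking:

```python
# Minimal TCP port forwarder: the Python equivalent of
#   socat TCP-LISTEN:$PORT,fork TCP:$NEWIP:$PORT
import socket
import threading

def pipe(src, dst):
    # Copy bytes from src to dst until EOF, then half-close the destination.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def forward(listen_port, dest_host, dest_port):
    # Accept connections on listen_port and pipe each one, in both
    # directions, to dest_host:dest_port.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", listen_port))
    srv.listen(5)
    while True:
        client, _ = srv.accept()
        upstream = socket.create_connection((dest_host, dest_port))
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()
```

Run `forward(25, NEWIP, 25)` and every SMTP connection to the old box transparently reaches the new one, which is exactly the behavior the socat loop above sets up for each listed port.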

Note how we establish an SSH tunnel mapping a connection to localhost:3306 on the new server to port 3307 on the old one instead of simply forwarding the connection and socket to the new server - this is done so that if you have users who are permitted on 'localhost' only, they can still connect (forwarding the connection would deny access due to a connection from an unauthorized remote host).

Update: a friend has pointed out this video to me, if you thought 0 downtime was bad enough... These guys move a live server 7km through public transport without losing power or network!

          Adidas Originals Tubular Doom PK Pale Nude Brown Shoe BB2390        

A futuristic Tubular with a sock-like wool upper. The original Tubular was released as a running shoe in 1994, but the modern-day version is based on the prototypes that were too far ahead of their time. Today's Tubular brings what was impossible in the '90s to life. The shoe's outsole is inspired by the inner tube of inflatable tyres. This pair of Tubular shoes features a soft, undyed Argentinian leather heel cage with star perforations, built on an ultra-comfy mélange wool adidas Primeknit upper and finished with reflective laces. Adidas Primeknit wraps the foot in adaptive support and ultralight comfort. Mélange wool sock with soft, undyed Argentinian leather overlays. Reflective laces; sock-like construction for a snug fit. EVA midsole for lightweight cushioning. Star perforations on leather heel cage. Tubular EVA outsole. Material: Primeknit upper, rubber sole.

          Adidas Tubular Instinct Boost Black Shoe BB8401        

A hi-cut Tubular with an undyed Argentinian leather upper. The original Tubular was released as a running shoe in 1994, but the modern-day version is based on the prototypes that were too far ahead of their time. Today's Tubular makes possible what was impossible in the '90s. The shoe's outsole is inspired by the inner tube of inflatable tyres. These Tubular shoes are built with energising boost™ in the midsole. They feature a soft, undyed Argentinian leather upper with star-shaped perforations, a hi-cut look and reflective laces. Boost™ is Adidas' most responsive cushioning ever: the more energy you give, the more you get. Undyed Argentinian leather upper with star perforations. Full grain leather lining. Reflective laces. Semi-translucent, modified Tubular-style rubber outsole. Material: leather upper, rubber sole.

          Adidas Tubular Instinct Boost Chalk White Beige Shoe BB8400        

A hi-cut Tubular with an undyed Argentinian leather upper. The original Tubular was released as a running shoe in 1994, but the modern-day version is based on the prototypes that were too far ahead of their time. Today's Tubular makes possible what was impossible in the '90s. The shoe's outsole is inspired by the inner tube of inflatable tyres. These Tubular shoes are built with energising boost™ in the midsole. They feature a soft, undyed Argentinian leather upper with star-shaped perforations, a hi-cut look and reflective laces. Boost™ is Adidas' most responsive cushioning ever: the more energy you give, the more you get. Undyed Argentinian leather upper with star perforations. Full grain leather lining. Reflective laces. Semi-translucent, modified Tubular-style rubber outsole. Material: leather upper, rubber sole.

          VSR700 demonstrator performs 1st autonomous flights        
Airbus Helicopters recently started autonomous flight trials of a VSR700 Optionally Piloted Vehicle (OPV) demonstrator, paving the way for a first flight of the actual VSR700 prototype in 2018. A light military rotary-wing tactical unmanned aerial vehicle, the VSR700 is being developed jointly by Airbus Helicopters and Helicopteres Guimbal, the original manufacturer of the civil-certified Cabri G2 helicopter from which the VSR700 is derived. "We are pleased to have achieved this milestone onl...
          The Return of the CR-100        
The kit from the CR line of aerobatic aircraft returns to the market. The CR-100 is the side-by-side two-seat aerobatic aircraft produced in kit form by Dyn’Aéro under the direction of Christophe Robin at Dijon-Darois. The prototype made its first flight on 27 August 1992, shortly before the official creation of Dyn’Aéro in October […]
          Ivan Henriques        
The development of this prototype is part of the evolution of bio-machines previously built by artist Ivan Henriques (°1978, Brazil), which are hybrids of living organisms and machines, creating an evolutionary vector between machines and nature. In collaboration with the scientists of LabMET, Faculty of Bio-Engineering at the University of Ghent, Henriques developed a […]
          Jimmy Butler brings everything the Timberwolves need. And more.        

When he first met with the media near midnight on Thursday after the 2017 NBA draft, Tom Thibodeau summoned his vaunted discipline in the service of diplomacy.

As the President of Basketball Operations for the Minnesota Timberwolves, Thibs had just executed the first bold, franchise-defining move of his 14-month tenure. Up until then, he and Wolves general manager Scott Layden had kept their powder dry: taking the consensus “best player available” a year ago with the 5th pick in the 2016 draft; meting out relatively paltry dollar amounts for a trio of journeymen during what was the most profligate free agent spending spree in NBA history in the summer of ’16; and then spending the entire 2016-17 season evaluating their roster while force-feeding playing time to their promising stockpile of young talent.

As he sat down to face the media late at night on Thursday, Thibs had one last act of restraint left to execute: Describe the process and ramifications of his blockbuster trade with the Chicago Bulls earlier that evening without getting up and tap-dancing on the table.

He began by praising the players headed to Chicago, saying that he “hated to part ways” with high-flying guard Zach LaVine and hard-scrabble defender Kris Dunn. “Not only are they good players, they are good people,” he said.

Including the seventh overall pick in that night’s draft as part of the trade also “wasn’t an easy decision” Thibs claimed, “because there are a lot of good players out there.”

Uh huh. But those three assets fetched Jimmy Butler, merely the best player the Timberwolves have ever acquired via a trade in the 28-year history of the franchise and an absolutely perfect fit for the team under their current circumstances.

For the most part, however, Thibs remained steadfast in his diplomacy. When asked if the deal would have fallen apart if the Bulls hadn’t also thrown in their own first-round pick — the 16th overall, just nine spots below the Wolves’ slotted choice — he kept a straight face, claiming that because LaVine is “terrific” and the Wolves were “giving up a lot…we knew we had to get back multiple assets ourselves. We thought it was a fair deal.”

Then someone asked if Thibs had spoken to Butler yet, and for a blazing second, the stoicism slipped. “Ah, yeah, I did talk to him,” he replied. “I think any time a player gets traded there are mixed emotions. Obviously he is leaving a lot of memories and friends behind; teammates and things like that.”

Then, in the midst of the final sentence of his answer, the feelings of pure joy and exultation won the war for control over the features of Thibs’ face: “But he is looking forward to coming here, I can tell you that.”

Blood, sweat and peers

The notion of “blood brothers” is recognized as a profound aspect of some human relations. Less common is the concept of a “blood father” and “blood son,” but it translates well to what exists between Thibodeau and Jimmy Butler.

There is an unspoken oath of loyalty, as enduring as any family ties, borne by a shared ethos for what constitutes success, dignity and honor, and what is required to survive and thrive within that value system.

Down to their respective marrows, Thibs and Butler understand the compensations of sweat equity. Thibs clawed his way up the assistant coaching ladder for 20 years before finally getting the head coaching job with the Bulls in 2010. The next season, Chicago selected Jimmy Butler with the 30th and final pick of the first round in the 2011 NBA draft.

“I think back to the opportunity I had to first be coaching him,” Thibs recalled about Butler Thursday night. “It was the lockout year”—NBA owners shut down the 2011-12 season until Christmas Day to exact concessions from the players union —“so we had like three or four days and then we went into the lockout. For a rookie that’s tough.

“So he missed summer league, he missed being in the gym all summer, and fall practice, and then when the season began it was a condensed schedule so you didn’t have a lot of practice time.

“But once we got back I will always remember that he and Luol Deng would come in every night. And I would look down from my office and they’d be working out together. And at that time he was a little shy. Luol would come up and tell me how good he was and said he should be playing, and he was his advocate. But I just watched the way he worked. And then the first opportunity he had to play was in Madison Square Garden against [star forward] Carmelo [Anthony]. We had injuries and he was a rookie, so I didn’t know what would happen. And he played great. So that told me a lot about him.”

Butler ranked 13th on the team in minutes-played per game that rookie season. The next year he was up to 5th. He led his team — and ranked second and then first in the entire NBA — in minutes per game the next two seasons. He got better and better in nearly every aspect of the game, while Thibodeau relentlessly fostered and milked the improvements.

In their fourth and final season together in Chicago, Butler was given the NBA Most Improved Player award. Over a six-week period during the ensuing off-season, the Bulls management fired Thibs and awarded Butler a 5-year, $95-million contract.

As Thibodeau licked his wounds and went on a season-long sabbatical, picking the brains of other successful NBA franchises while receiving the paychecks still owed him by Chicago, Butler retained his pattern of continued improvement during the 2015-16 season. He had clearly usurped the status of team leader from erstwhile all-stars Derrick Rose and Joakim Noah, and less than two months into his first NBA experience without Thibs on the sidelines, he blatantly criticized Thibodeau’s successor, Fred Hoiberg, for a lack of intensity.

Specifically, Butler said: “I believe in the guys in this locker room, but I also believe we probably have to be coached a lot harder at times. I’m sorry, I know Fred is a laid-back guy and I really respect him for that, but when guys aren’t doing what they are supposed to do, you have to get on guys — myself included.”

Serendipity squared

It is remarkable — pinch-me I’m dreaming incredible, actually — how well the acquisition of Butler addresses the myriad flaws and question marks that plagued the Timberwolves last season and cast a shadow on the ability of Thibs to recreate the purposeful synergy he unfurled both as an assistant in Boston and as the head man in Chicago.

If you could order up a special elixir to cure the Wolves longstanding ills while maximizing their current assets, you’d get a veteran leader celebrated enough to ascend to the top of a pecking order that includes burgeoning stars Karl-Anthony Towns and Andrew Wiggins, young enough to sustain his luster and influence as KAT and Wigs blossom, and dedicated and selfless enough not to impede their growth if and when they surpass his capabilities.

That leader would have deep experience with, and appreciation for, the unyielding mania that is the Thibodeau coaching method, and be able to project, by word and deed, on the court and in the locker room, which elements of that mania to bank, which to discount, and how to beneficially ride out the experience over the course of a long season. He wouldn’t be afraid to call it as he sees it and feels it, whether the calling is directed at Thibs, KAT, Wigs or any other member of the team. As the greatest individual success story of Thibodeau’s career in coaching, the leader would have the wisdom, from a player’s perspective, of knowing when to apply and when to relieve the pressure wrought by Thibodeau on his teammates.

Jimmy Butler is all of those things. And more.

At the end of last season two pertinent questions about the future course of the Wolves stood out. One was, Can Thibodeau ever instill the type of discipline, dedication and comprehension necessary to get this young crew to play quality defense? The other was: Can the Wolves ever leverage the often redundant virtues and vices that Wiggins and LaVine share in a manner that wrings maximum value out of both players?

For question one, Butler is one of four elite prototypes who can best demonstrate how to capably perform within Thibodeau’s defensive schemes (and the others, Deng, Noah, and Kevin Garnett, are all past their primes). For question two, at 6-7 in height and 220 pounds, Butler is a great fit at small forward. Wiggins and LaVine both demonstrated that their best position was shooting guard, and attempts to play LaVine at the point or Wiggins at forward always stunted their growth. Now LaVine is gone and Wiggins can slide into the backcourt slot that gives him a height advantage and reduces the pounding on his thin frame.

Last season, the Wolves were soft. The only players on the roster with a real edge to their attitude were rookie Kris Dunn and otherwise hapless forward Adreian Payne. Butler's presence and prominence inject a vital toughness into the team's identity that neither the ostentatiously modest Towns nor the gnomically stoic Wiggins can muster. As Thibs told Sports Illustrated two years ago, "If they don't bite as puppies, they usually don't bite. Jimmy was biting right from the start."

Then there is the matter of salary. Figuring out how to add a proven veteran leader to a roster that was already staring at a pair of sure-fire maximum contracts for Wiggins this autumn and Towns in the fall of 2018, plus a deal in the neighborhood of $15-$20 million to retain LaVine (provided he fully recovers from his ACL tear last season), seemed like a daunting task that would financially hamstring other repairs and reinforcements necessary for a championship contender.

Fortunately, Butler signed his five-year deal before the latest media deal blew an inflationary bubble into the NBA's revenue-sharing salary cap structure. Thus, Butler is on the books at the relative bargain price of about $20 million per season over the next two years. That's about what Portland bench player Allen Crabbe will earn.

The playoffs are now a reasonable expectation

So, is the Jimmy Butler trade perfection? Nah. But if you survey the big picture situation and weight the Wolves’ needs it addresses according to importance and alternative solutions, it is probably 70 percent perfect.

Butler improved the accuracy and frequency of his three-point shooting last season over his career norms, but there is no question that the loss of LaVine makes the trade a net deficit in long-range shooting for a franchise that ranked last in three-point attempts and makes last season. The deal also diminishes team depth if you consider that LaVine, Dunn, and the No. 7 pick all likely would have been rotation players for the Wolves by the end of the upcoming season. It is unlikely that No. 16 pick Justin Patton will be that far along by early 2018.

But let’s not mince words here: This is a fabulous turn of events for the Minnesota Timberwolves. It clarifies the pecking order and crystallizes the mission and identity of the team. It removes the pressure of leadership from a couple of kid-stars who were not ready to lead, and adds the pressure of becoming a winner to their plate at precisely the time when they need to put up or shut up regarding their value as team-enhancing performers.

“As many of you know, Jimmy is just going into the prime of his career,” Thibs said of the 27-year-old Butler. “The thing that you can’t overlook with him is his playmaking. You are getting a two-way player. You are getting a guy who can score in a lot of different ways, defend multiple positions — he can actually guard four positions well. He makes big shots late, plays the right way, is tough, practices hard, smart.”

Have there been bigger trades in the history of the franchise? Yes, you would have to say that dealing Kevin Garnett to Boston qualifies, and perhaps so does the swap that put Kevin Love in Cleveland and brought Wiggins to Minnesota three years ago.

But the crucial difference in those examples is that the Wolves were dealing from a position of weakness. The Garnett trade occurred in the exhausted aftermath of three straight trips out of the playoffs after eight straight postseason appearances. Dealing Love was prompted by his announcement that he would exercise his option and declare for free agency.

This time out it was the Wolves' trading partner that was enervated and on the decline. This time out, one of the top 10-15 players in the NBA (according to analytics and the eye test; you could bump it higher if you consider the value additions of Butler's age, salary, two-way prowess and overall leadership and fit) is coming to Minnesota.

For the first time in their history, the Wolves have been able to pounce on the availability of an existing star.

For the first time in a dozen years, the playoffs are a reasonable expectation — and a first step into a tantalizing future.

          Jewelry and Industrial Processes class        

@Me1 wrote:

I've applied for a $1,350 grant from the Michigan Council for Arts and Cultural Affairs to take a workshop on jewelry and industrial processes. This is becoming a big topic for me as a writer, although I wonder if it is somewhat controversial (if not misunderstood). This class, taught by Don Friedlich, may not quite have enough students, though, to make it a go. So I've offered to help spread the word.

Don's Designing in Multiples Workshop takes place October 7 - 8, 9 a.m. – 5 p.m. at Maryville University, St. Louis, MO. The cost is $400. See more at

The reason I'm interested is that I've been studying industrial manufacturing processes at a local community college. I feel it could be a way for me to produce cheaper pieces faster, in addition to my artistic pieces. This would allow me to offer customers options and save some wear and tear on my hands. In Don's workshop, we are to spend some time making prototypes out of cardboard. Then he will lecture on industrial processes and help us to make creative use of these concepts -- such as casting, laser and water jet cutting, commercial photo etching, etc.

Don is internationally known. To see his work go to

Posts: 3

Participants: 2

Read full topic

           Modelling concept prototype competencies using a developmental memory model         
Baxter, Paul and De Greeff, Joachim and Wood, Rachel and Belpaeme, Tony (2013) Modelling concept prototype competencies using a developmental memory model. Paladyn, Journal of Behavioral Robotics, 3 (4). pp. 200-208. ISSN 2080-9778
           Infiniti brings retro concept to Pebble Beach         
Infiniti says it is bringing a special prototype to Pebble Beach. The brand has already released a teaser.
           Housing prototypes for the wellbeing of elderly in Lincolnshire         
Paranagamage, Primali and Herron, Rebecca and Vilalta-perdomo, Eliseo Luis and Jackson, Jennifer (2013) Housing prototypes for the wellbeing of elderly in Lincolnshire. Working Paper. unpublished.
          The Secret Behind The Invention of Spanx        
When most people think about Spanx and how they were invented, they often mention inventor Sara Blakely cutting the feet off of her pantyhose to create the first prototype. But the secret wasn't in the prototype per se. It was in the metaphor that drove her to cut the feet off those hose. —“Shapewear is the canvas and […]
          How to Make Sure Prototypes are Useful, Even When They Fail        
It worked flawlessly for 4 minutes and 25 seconds… And then it didn’t.  The VP smiled and said, “I get the idea.”  After getting through the embarrassment of the failure, the team learned what went wrong, and got to work testing variations of the failed component.  The new versions didn’t fail, and the product went on […]
          BumpTop™ Prototype        

Keepin' it Real: Pushing the Desktop Metaphor with Physics, Piles and the Pen

Can we combine it with XGL-Compiz together?

from P'Art

          Inspiring Makers- and Getting Inspired, Our GHC Hardware Hackathon Workshop        
Stacie & I making sure we’re on the room’s agenda… and we are! At the 12,000-participant Grace Hopper Convention in Houston, TX last week, we conducted a hackathon-with-a-workshop. Stacie Hibino (@staciehibino) and I helped 250+ participants manipulate a simple LED circuit kit into a prototype or actual product. Our competition was stiff for that […]
           Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor         
Zin, Hafiz M. and Harris, Emma J. and Osmond, John P. F. and Allinson, Nigel M. and Evans, Philip M. (2013) Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor. Physics in Medicine and Biology, 58 (10). pp. 3359-3375. ISSN 0031-9155
          Job Vacancy: UI/UX Designer Lead         
S1 Bachelor/Diploma degree in any major with a minimum of 3-5 years of experience as a UI/UX or Web Designer. Technical design, wireframe, prototype, user research and usability testing skills are preferred. Have an exceptional portfolio of shipped products containing ...

          Kissenger "simulates" kissing loved ones        

Touted as Earth's "first mobile kiss messenger," Kissenger is a rubbery-looking dock that humans put their phones in. It has a tactile surface they depress with their meat. The movements are then transmitted in realtime over the internet, so that a replica of them may be experienced by another human.

Plug in to your phone and give your loved ones a kiss over the Internet. Kissenger can sense your kiss and transmit realistic kissing sensations to your partner in real time. You can also feel the force on your lips when your partner kisses you back. Share an intimate moment with your friends and families while chatting with them on your phone.

The device comprises six sensors, corresponding actuators, and a meat-colored silicone sheath. There's an app that goes with it so the humans can interact on an audiovisual-discursive level at the same time. It's at the prototype stage with nothing to buy yet, but obviously we should keep an eye on this. It should suffice to say that our previous recommendations with respect to establishing contact with this species have not changed.

High precision force sensors are embedded under the silicone lip to measure the dynamic forces at different parts of your lips during a kiss. The device sends this data to your phone, which transmits it to your partner over the Internet in real time. Miniature linear actuators are used to reproduce these forces on your partner's lips, creating a realistic kissing sensation. Kissenger provides a two-way interaction just like in a real kiss. You can also feel your partner's kiss on your lips when they kiss you back.
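The loop described above is essentially a measure-serialize-transmit-reproduce pipeline: each partner's sensor readings become the other partner's actuator targets. A minimal sketch in Python of that symmetry (the frame format and field names here are illustrative assumptions, not Kissenger's actual protocol):

```python
import json

NUM_SENSORS = 6  # the device is described as having six sensors with matching actuators

def pack_frame(forces):
    """Serialize one sample of per-sensor lip forces (in newtons) for the network."""
    if len(forces) != NUM_SENSORS:
        raise ValueError("expected one reading per sensor")
    return json.dumps({"forces": forces}).encode("utf-8")

def unpack_frame(frame):
    """Decode a received frame into per-actuator force targets."""
    return json.loads(frame.decode("utf-8"))["forces"]

# Whatever one partner's sensors measure is exactly what the other
# partner's actuators are asked to reproduce -- a symmetric, two-way link.
sample = [0.12, 0.30, 0.45, 0.41, 0.28, 0.10]
received = unpack_frame(pack_frame(sample))
print(received == sample)  # True: the force profile survives the round trip
```

In a real device the framing would need timestamps and a low-latency transport rather than JSON, but the round-trip invariant is the core of the "realistic sensation" claim.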

[via The Verge]

          You can now play the prototype for Keiji Inafune’s Red Ash        
Red Ash, a new game from Keiji Inafune and Comcept, now has a playable prototype online. The game, which has eight days left to earn over $300,000 on Kickstarter, can now be played in browser if you have Unity Web Player installed. Alternatively, instructions for how to download the game can be found in the […]
           10 Home-Grown Technologies That Make Indonesia Proud        
1. Anoa Armored Vehicle (Panser Anoa)

Its name is inspired by the anoa, a mammal native to Sulawesi, and its appearance is on par with European-made vehicles. It was developed by the Department of Defense and PT Pindad to achieve self-sufficiency in defense equipment. This six-wheeled armored vehicle can reach speeds of up to 90 km/h, clear a trench one meter wide, and climb slopes of up to 45 degrees. It is clad in bulletproof steel that is guaranteed to withstand fire from an AK-47 or M-16.

2. N-250 Gatotkaca Aircraft

This is a regional commuter turboprop aircraft originally designed by IPTN (now PT. Dirgantara Indonesia) and launched in 1995. The code N stands for Nusantara, indicating that its design, production, and calculations were carried out in Indonesia, or perhaps for Nurtanio, the founder and pioneer of Indonesia's aviation industry. The aircraft was named Gatotkaca and was IPTN's flagship in the 50-70 passenger market.

3. KRI-Krait-827

This warship is the result of a knowledge exchange between the Indonesian Navy, through the Mentigi maintenance and repair facility (fasharkan), and PT Batan Expressindo Shipyard (BES), Tanjung Guncung. It was built in 14 months, handled 100% by Indonesians. Made of aluminum, it has a tonnage of 190 DWT and a cruising range of about 2,500 miles. It is equipped with radar with a range of 96 nautical miles (about 160 km), a GMDSS area 3 navigation system, and an installed speed of 25 knots.

4. Smart Eagle II (SE II)

This is the first UAV (Unmanned Aerial Vehicle) prototype built by PT. Aviator Teknologi Indonesia for Indonesian intelligence purposes. The SE II uses a 150cc two-stroke engine and can fly for up to 6 hours. It is equipped with a color TV camera and can operate at night using a thermal imaging (TIS) camera as a sensing option.

5. Arina-SMK Car

This car was designed around motorcycle engines with capacities of 150cc, 200cc and 250cc. Fuel consumption is only 1 liter per 40 km. At 2.7 meters long, 1.3 meters wide and 1.7 meters tall, it can fit into narrow roads and alleys. It is named Arina-SMK because it was built in cooperation between Armada Indonesia (Arina) and vocational high school (SMK) students.

6. New Indonesian-Made Weapons
We should rightly be proud of Indonesia's progress in the military field.

7. An Original Indonesian-Made Chip

The Xirka WiMAX chipset is a chip made by native Indonesians. Making a chip of such complexity is no easy task. Xirka, overseen by several Indonesian engineers, began development in 2006.

The chipset comes in two specifications: the Xirka chipset for Fixed WiMAX and the Xirka chipset for Mobile WiMAX. The fixed WiMAX version was launched in August of this year, while the mobile WiMAX version is planned for launch in the fourth quarter of 2009.

This original Indonesian product was launched personally by the Republic of Indonesia's Minister of Research and Technology, Kusmayanto Kadiman. He explained that all of the components in Xirka are Indonesian-made, and operators providing WiMAX services are required to use Xirka.

The Wakamini is a tablet computer that is one hundred percent Indonesian-made, at a price anyone can afford: Rp3,599,000.
Its locally designed screen is quite unique: it can rotate 180 degrees, measures 10 inches, and comes in four colors: brown, red, black and blue.
It runs Windows 7 Home Premium or Ultimate, with a touchscreen.
The Wakamini is equipped with a 1.66 GHz Intel Atom N450 processor, 1 GB of DDR2 RAM and 250 GB of storage; it weighs 1.35 kg and has Wi-Fi, a 4-in-1 card reader, a 1.3 MP webcam and a battery that lasts three hours.

Zyrex Wakatobi Mini 963 tablet PC

The Wakatobi, meanwhile, is smaller and designed for children, at just 8.9 inches. Its screen design is also unique, and its price is quite attractive at Rp2,999,000 per unit, the cheapest in its class.
Unlike its sibling, however, the Wakatobi does not support multi-touch.
The Wakatobi's processor is a 1.6 GHz Intel Atom N270, with 1 GB of DDR2 RAM. It weighs only 1.25 kg.

The Indonesian Army's Technology Assessment Institute (Lemjitek) in Karangploso, Malang Regency, has managed to create a combat robot.

The combat robot prototype has been tested several times and can travel up to 1 km from the control center. "It measures 1.5 m by 0.5 m and weighs around 100 kg. The robot has a two-wheel drive engine, can carry loads of up to about 150 kg, and its maximum speed can reach 60 km/h," he explained. The robot, created in 2009 and still unnamed, is driven by electric power from two batteries stored inside its body.

The two batteries supply 36 volts for propulsion and 12 volts for the control system. Gunawan admitted that the robot is not yet fully complete, as assembly has only just finished; it is probably still at around 70-80% of the desired ideal condition.


The RX-420 rocket, however, is still under consideration by the Department of Defense as to whether it can become a reliable land-based deterrent weapon, so that Indonesia would not need a fleet of ships or other weapons of war, cost being the dominant factor.

The idea of domestic missile production first emerged in 2005. Rp 2.5 billion was poured into the missile-building project that year, and if it materializes, the Department of Defense will partner with PT Pindad Indonesia, a domestic arms manufacturer conducting research on 122-millimeter warheads.

To date, LAPAN has successfully launched a rocket with a range of 100 kilometers and an initial launch speed of 4 times the speed of sound.
          5 Indonesian-Designed Aircraft of the Modern Era        
The turmoil over PT Merpati Nusantara Airlines' procurement of the MA-60 aircraft built by Xi'an Aircraft International Company keeps generating controversy, from an excessively high price and poor product quality to drawn-out contract renegotiations. Yet in the same aircraft class, PT. DI has the competitive CN 235, whose reliability is proven and which is in service in several countries, including the United States. Here are 5 Indonesian-made aircraft in modern aviation, by type and class. Note that some of these aircraft are already in production, while others are merely prototypes that have passed aerodynamic testing.

1. N-2130 Aircraft
The N-2130 was a commuter jet with a capacity of 80-130 passengers, an original design by IPTN, now known as PT Dirgantara Indonesia. The code N stands for Nusantara, indicating that its design, production, and calculations were carried out in Indonesia, or perhaps for Nurtanio, the founder and pioneer of Indonesia's aviation industry.

On November 10, 1995, President Soeharto announced the N-2130 project and invited the Indonesian people to make it a national project. The N-2130, estimated to cost two billion US dollars, was to be financed collectively through the sale of two million shares at 1,000 US dollars each. To that end, the company PT. Dua Satu Tiga Puluh (PT DSTP) was formed to carry out this large project.

When the 1997 monetary crisis struck Indonesia, PT DSTP faltered. A year later, amid political instability and funding irregularities, the majority of shareholders, through an Extraordinary General Meeting of Shareholders (RUPSLB) on December 15, 1998, asked PT DSTP to liquidate itself. As a result, the N-2130 project was abandoned.

2. N-250 Aircraft
The N-250 is a regional commuter turboprop aircraft originally designed by IPTN, now PT. DI. The code N stands for Nusantara, indicating that its design, production, and calculations were carried out in Indonesia, or perhaps for Nurtanio, the founder and pioneer of Indonesia's aviation industry. The aircraft was named Gatotkoco (Gatotkaca).

The aircraft was IPTN's flagship in its bid for the 50-70 passenger market, with advantages over its class rivals when it was launched in 1995. It was the star of the 1996 Indonesian Air Show in Cengkareng, but production was eventually halted after the 1997 economic crisis. The N-250 program was planned to be revived by B.J. Habibie after obtaining approval from President Susilo Bambang Yudhoyono, in an Indonesia now considered democratic. However, to reduce production costs and improve price competitiveness in the international market, some of its performance features were scaled back, such as reduced engine capacity and the planned removal of the fly-by-wire system.

3. CN-235 Aircraft
The CN-235 is a twin-engine, medium-class turboprop transport aircraft, designed jointly by Indonesia's IPTN and Spain's CASA. The CN-235 is currently the most commercially successful aircraft in its class.

The CN-235 is the product of a collaboration between IPTN, or Industri Pesawat Terbang Indonesia (now PT.DI), and CASA of Spain. The two countries' cooperation began in 1980; the Spanish prototype first flew on November 11, 1983, while the Indonesian prototype first flew on December 30, 1983. Production began in both countries in December 1986. The first variant was the CN-235 Series 10, followed by the upgraded CN-235 Series 100/110, which uses two 1,750 shp General Electric CT7-9C engines instead of the 1,700 shp CT7-7A engines of the earlier model.

4. N-219 Aircraft
The N-219 is a new-generation aircraft designed by Dirgantara Indonesia as a true multi-mission, multi-purpose aircraft for remote areas. The N-219 combines the most modern and advanced aircraft system technology with tried-and-proven all-metal aircraft construction. It has the largest cabin volume in its class and a flexible door system for efficiency in multi-mission passenger and cargo transport. The N-219 was scheduled for wind-tunnel testing in March 2010, with delivery to its first customer expected some three or four years later. The N-219 is a development of the NC-212.

5. NC-212 Aircraft
The NC-212 Aviocar is a medium-sized turboprop aircraft designed and produced in Spain for civil and military use. The type has also been produced in Indonesia under license by PT. Dirgantara Indonesia. In January 2008, EADS CASA even decided to move its entire C-212 production facility to PT. Dirgantara Indonesia in Bandung. PT. Dirgantara Indonesia is the only aircraft company licensed to build this type of aircraft outside the original manufacturer's plant. (SOURCE)

          Limit Regulations on Autonomous Vehicles        

Autonomous vehicles are no longer a fantasy that appears only in science fiction movies. In theory, the ongoing investments and research should allow many driverless cars to transition out of development and into widespread commercial use over the next decade. However, the typical precautionary principles that accompany new legislation act as a market barrier that will impede the commercialization of the technology. In August 2014, Google, which had designed a prototype that removed the steering wheel, accelerator and brake pedals, was forced to reinstall these manual driver controls. This is one of the first examples of onerous regulations forcing AV innovators to take a step back. To prevent these types of setbacks, policymakers should focus on clearing the existing roadblocks preventing the development of AVs and address the excessive restraints concerning the hypothetical dangers of their use. Only then can the copious social benefits of AV technologies be realized.

Autonomous vehicles are vehicles that can drive themselves. In other words, they are capable of sensing the environment and navigating by themselves. In 2013, the National Highway Traffic Safety Administration released a classification system that partitions vehicle automation into 5 levels, ranging from level 0 (no automation) to level 4 (fully self-driving automation). The first semi-autonomous vehicles appeared in 1984 with Carnegie Mellon University's Navlab and ALV projects. Ever since, autonomous vehicle research has been increasing, with major progress by Google, major auto manufacturers, government organizations and universities. In early 2014, IHS Automotive released a study projecting a global total of "nearly 54 million" self-driving cars by 2035, and predicting that "nearly all of the vehicles on the road would be self-driving cars or self-driving commercial vehicles by 2050."
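The five-level NHTSA scale is an ordered classification, which maps naturally onto an ordered type. A short sketch in Python (the intermediate level names are paraphrased from the 2013 policy statement, so treat them as an approximation; the helper function is purely illustrative):

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """NHTSA's 2013 five-level vehicle-automation scale."""
    NO_AUTOMATION = 0          # the driver performs all driving tasks
    FUNCTION_SPECIFIC = 1      # one automated function, e.g. cruise control
    COMBINED_FUNCTION = 2      # two or more functions automated in unison
    LIMITED_SELF_DRIVING = 3   # car drives itself; driver must be ready to take over
    FULL_SELF_DRIVING = 4      # the vehicle handles the entire trip unaided

def is_fully_autonomous(level: AutomationLevel) -> bool:
    """Only Level 4 needs no human fallback at any point in the trip."""
    return level == AutomationLevel.FULL_SELF_DRIVING

print(is_fully_autonomous(AutomationLevel.LIMITED_SELF_DRIVING))  # False
```

Because the levels are ordered integers, laws that draw a line at "some automation engaged" or "no human fallback" reduce to simple threshold comparisons on this scale.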

In response to these predictions and the pressure to "legalize" the testing of AV technologies, new AV legislation to set policy has been introduced in many states, and enacted in California, Florida, Michigan, Nevada and the District of Columbia. One important policy area these laws address is safety. For instance, most states' rules mandate that a licensed driver be in the driver's seat at all times during autonomous operation. Another policy area these laws address is liability. Typically, the laws set up a legal framework through which manufacturers can be held accountable for negligence in selling faulty products, including strict liability for product defects or misrepresentation of product capabilities. The statutes enacted in D.C., Florida, and Michigan contain specific language protecting original manufacturers from liability for defects introduced in the aftermarket by a third party who converts a conventional vehicle into an autonomous vehicle.

In some very specific and narrow respects, these state-level legislative actions regarding AV safety and liability can be beneficial. However, state laws also have the effect of slowing down the entry of new technology into the market. For example, California's 2012 law on self-driving cars called for the state legislature to draft rules regarding AV operation by the start of 2015, although by mid-March 2015 there was still no draft available. In addition, the safety law that requires all autonomous vehicles to have a human driver prevents the testing of fully automated vehicles. Hence, these laws effectively act as another set of onerous regulations preventing further innovation in autonomous vehicles.

While the concerns about safety are legitimate and the desire for comprehensive laws is reasonable, there are numerous reasons why it is poor policy to hinder the deployment of these new automotive technologies. First and foremost, self-driving vehicles are predicted to increase safety and decrease the number of accidents. Currently, more than 33,000 people die each year in the United States from automobile crashes. According to U.S. government estimates, as many as 90 percent of all car accidents are caused by human error. As such, the Eno Center for Transportation recently projected that if only 10 percent of all vehicles in the United States were self-driving, the number of accidents each year would be cut by 211,000, and 1,100 lives would be saved. Autonomous vehicles are the latest in a series of technologies that improve safety. For example, the National Highway Traffic Safety Administration estimates that electronic stability control (ESC) systems, which use computer controls to brake individual wheels on a vehicle losing directional control or stability, saved 2,202 lives between 2008 and 2010. In addition, the legal precedents of products liability litigation established over the last half century provide AV manufacturers with a very strong set of incentives to make their products as safe as possible. Hence, encumbering the legal system with a new set of overly broad federal or state liability statutes relating to AVs is unnecessary.

Second, studies have suggested that intelligent vehicles can possibly reduce congestion and lower fuel consumption. Self-driving cars could ease congestion because commutes would be quicker, as cars driven by robots could travel at steadier speeds and avoid traffic jams. However, it is also true that AV technology of Level 3 or higher is likely to substantially reduce the opportunity cost of congestion, hence inducing more people to drive who would otherwise not have made the trip. While the two effects may offset each other, making it difficult to clearly predict the overall impact of AVs on congestion, most experts predict some decrease in congestion. When it comes to fuel consumption, AVs will allow for fuel efficiency because self-driving cars and trucks will be able to bunch close together at steadier speeds. The Rocky Mountain Institute estimates that the reduction in wind drag alone from vehicles traveling closely together could reduce fuel use by 20 to 30 percent. In addition, AV technology can improve fuel economy by 4-10 percent by accelerating and decelerating more smoothly than a human driver. Finally, AVs might reduce pollution by enabling the use of alternative fuels. If the decrease in the frequency of crashes allows lighter vehicles, many of the range issues that have limited the use of electric and other alternative vehicles are diminished.

Despite these social benefits, policymakers have slowed the legalization of AV research by imposing the "precautionary principle" on a developing technology, potentially costing human lives. Hence, as argued by Adam Thierer and Ryan Hagemann of the Mercatus Center, policymakers should adopt a "permissionless innovation" attitude toward driverless vehicle technologies rather than a "precautionary principle" attitude. "Permissionless innovation" holds that experimentation with new technologies and business models should generally be permitted by default. This open and lightly regulated platform allows entrepreneurs to adopt new business models and offer new services without first seeking approval from regulators. In this scenario, pre-emptively resolving liability issues would not be a precondition to the commercial rollout of autonomous vehicles. Any perceived or actual problems with new technologies could be corrected later through better-informed policymaking.

In conclusion, laws that encourage overregulation of AVs by trying to pre-emptively tackle hypothetical safety and liability concerns will impose costs in human lives, health, property damage and convenience. As such, the overall guiding principle for policymakers should be that AV technology ought to be permitted if and when it is superior to average human drivers. Safety regulations and liability rules should be designed with this guiding principle in mind. As a 2013 article in the San Diego Union-Tribune stated bluntly: "The issue of liability, if not solved, could delay or even wipe out the vision of driverless cars gaining widespread consumer use." With this in mind, the ultimate goal of policymakers regarding AVs should be to minimize government intervention in order to speed up the introduction and consumer availability of automated vehicles.

          Anyone interested in a Boba Fett Kubrick?        

Hi everyone,

I finally found a store here in Tokyo that is selling the Kubrick sets, and selling them individually as well. The sets go for $90.00; the individual figures are $13.00.
When I was there yesterday they only had the white prototype and both animated versions available for individual sale. If anyone is interested I would be happy to pick these up and send them to you at cost with no extra charges. The shipping for individual figures would be about $7.00, and for the sets I think it would run closer to $15.00.
Anyway, considering what these things are going for on eBay I thought I would offer this to you guys. I didn't want the entire set so I was really happy to find a shop that sells them individually. 
Please let me know ASAP as they only had two boxes left and only the three I mentioned above for individual sale.
He's pretty cool!

          Nocturne - Mirrored Acrylic Prototype        
Prototype for Mirrored Acrylic Nocturne

          Intel is building a fleet of 100 self driving cars and it wants to start testing them this year        
Roads and highways are about to get more crowded with self-driving car prototypes. The latest company to join the race is Intel, the...
          Blood vessels prove you are you who you say you are         

Researchers at NTNU in Gjøvik, Norway, have developed a prototype for a sensor that can scan and record both your fingerprint and the flow of blood in the veins in your fingers. The sensor prototype is small and easy to use, and has the potential to effect big changes in airport safety and border crossings. It will enable authentication to be both fast and secure. Photo: Kenneth Kalsnes


          Hyundai i10 Models pricing and features        

After the launch of the Hyundai i10, the latest news is that the brand ambassador of this car is "Shahrukh Khan", and there are a total of 4 models of this car: the Hyundai i10 D-Lite, Hyundai i10 Era, Hyundai i10 Magna and Hyundai i10 D-Auto. The car offers 14-inch wheels, body-colour bumpers, integrated six-speaker CD/MP3 stereos and automatic window operation. The mileage of this car ranges from 12-14 km/ltr. Though I am really confused regarding the usage of airbags in India, because average speeds come out at 20-30 km/hr, and at this speed I don't think airbags would serve any purpose; they are just an unusable addition to this car.

Pricing of Hyundai i10 cars:
Hyundai i10 Magna (O) – MRP: Rs. 530,475 [Mumbai]
Hyundai i10 D-Lite – MRP: Rs. 364,026 [Mumbai]
Hyundai i10 Magna – MRP: Rs. 433,515 [Mumbai]
Hyundai i10 Era – MRP: Rs. 404,501 [Mumbai]

Hyundai i10 is driven by 1.1 litre petrol engine with the company expecting 1.5 lakh cars to be sold next year.


Hyundai is preparing the replacement for the Atos, which has been on the market since 1997, namely the i10. The prototype shown in these photos is still disguised, although we can see glimpses of the all-new styling.

The i10 will offer a choice of three engines: a 1.1 litre petrol engine with 65 HP, a 1.1 litre diesel with 75 HP, and later on a 1.2 litre petrol engine delivering 80 HP will join the range.

The i10 will be facing a huge battle in what has become a very competitive market, with the introduction of the new Fiat 500 and the impending arrival of the new Ford Fiesta.


* Compact, stylish, practical and fun-to-drive
* Air conditioning as standard across the range
* Set to double Hyundai’s sales in city car sector

While Hyundai’s rivals are still reeling from the astonishing acclaim awarded to the i30 hatchback, the company has released details of the next instalment in its all-new range of ‘i’ cars – the i10.

This exciting new city car is designed to be fun-to-drive, affordable and practical while offering quality and equipment that no other rival can offer at the price.

With diminutive dimensions of just 3,565mm long and 1,595mm wide, the i10 promises to be easy to thread through city traffic and a pleasure to park in tight car parks.

Pricing of all Hyundai i10 models across Indian cities, by model and version:
Prices for AGARTALA
Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342768
i 10 Era (Solid) 374509
i 10 Era (Metalic) 378253
i 10 Magna (Solid) 398513
i 10 Magna (Metalic) 402257
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478503
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494218

Prices for AGRA
Model Price (Rs.)*
i 10 D-Lite (Solid) 340533
i 10 D-Lite (Metalic) 344294
i 10 Era (Solid) 376176
i 10 Era (Metalic) 379937
i 10 Magna (Solid) 400287
i 10 Magna (Metalic) 404045
i 10 Magna-O (Solid) 476869
i 10 Magna – O (Metalic) 480630
i 10 Magna-O with Sunroof (Solid) 492654
i 10 Magna – O with Sunroof (Metalic) 496414

Prices for AHMEDABAD
Model Price (Rs.)*
i 10 D-Lite (Solid) 347502
i 10 D-Lite (Metalic) 351339
i 10 Era (Solid) 383873
i 10 Era (Metalic) 387709
i 10 Magna (Solid) 408477
i 10 Magna (Metalic) 412314
i 10 Magna-O (Solid) 486626
i 10 Magna – O (Metalic) 490465
i 10 Magna-O with Sunroof (Solid) 502737
i 10 Magna – O with Sunroof (Metalic) 506574

Model Price (Rs.)*
i 10 D-Lite (Solid) 342269
i 10 D-Lite (Metalic) 346051
i 10 Era (Solid) 378091
i 10 Era (Metalic) 381873
i 10 Magna (Solid) 402324
i 10 Magna (Metalic) 406105
i 10 Magna-O (Solid) 479296
i 10 Magna – O (Metalic) 483076
i 10 Magna-O with Sunroof (Solid) 495164
i 10 Magna – O with Sunroof (Metalic) 498944

Prices for AIZAWL
Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374508
i 10 Era (Metalic) 378252
i 10 Magna (Solid) 398516
i 10 Magna (Metalic) 402258
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478502
i 10 Magna-O with Sunroof (Solid) 490476
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for AJMER
Model Price (Rs.)*
i 10 D-Lite (Solid) 342264
i 10 D-Lite (Metalic) 346046
i 10 Era (Solid) 378088
i 10 Era (Metalic) 381867
i 10 Magna (Solid) 402320
i 10 Magna (Metalic) 406103
i 10 Magna-O (Solid) 479293
i 10 Magna – O (Metalic) 483074
i 10 Magna-O with Sunroof (Solid) 495162
i 10 Magna – O with Sunroof (Metalic) 498943

Prices for ALIGARH
Model Price (Rs.)*
i 10 D-Lite (Solid) 340530
i 10 D-Lite (Metalic) 344292
i 10 Era (Solid) 376175
i 10 Era (Metalic) 379933
i 10 Magna (Solid) 400284
i 10 Magna (Metalic) 404045
i 10 Magna-O (Solid) 476868
i 10 Magna – O (Metalic) 480629
i 10 Magna-O with Sunroof (Solid) 492655
i 10 Magna – O with Sunroof (Metalic) 496417

Prices for ALLAHABAD
Model Price (Rs.)*
i 10 D-Lite (Solid) 340535
i 10 D-Lite (Metalic) 344294
i 10 Era (Solid) 376175
i 10 Era (Metalic) 379934
i 10 Magna (Solid) 400287
i 10 Magna (Metalic) 404045
i 10 Magna-O (Solid) 476867
i 10 Magna – O (Metalic) 480627
i 10 Magna-O with Sunroof (Solid) 492654
i 10 Magna – O with Sunroof (Metalic) 496415

Prices for ALWAR
Model Price (Rs.)*
i 10 D-Lite (Solid) 342264
i 10 D-Lite (Metalic) 346045
i 10 Era (Solid) 378089
i 10 Era (Metalic) 381868
i 10 Magna (Solid) 402320
i 10 Magna (Metalic) 406103
i 10 Magna-O (Solid) 479292
i 10 Magna – O (Metalic) 483073
i 10 Magna-O with Sunroof (Solid) 495158
i 10 Magna – O with Sunroof (Metalic) 498941

Prices for AMBALA
Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374508
i 10 Era (Metalic) 378252
i 10 Magna (Solid) 398516
i 10 Magna (Metalic) 402260
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478502
i 10 Magna-O with Sunroof (Solid) 490476
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for AMRAVATI
Model Price (Rs.)*
i 10 D-Lite (Solid) 342267
i 10 D-Lite (Metalic) 346048
i 10 Era (Solid) 378091
i 10 Era (Metalic) 381871
i 10 Magna (Solid) 402324
i 10 Magna (Metalic) 406106
i 10 Magna-O (Solid) 479295
i 10 Magna – O (Metalic) 483076
i 10 Magna-O with Sunroof (Solid) 495163
i 10 Magna – O with Sunroof (Metalic) 498945

Prices for AMRITSAR
Model Price (Rs.)*
i 10 D-Lite (Solid) 340151
i 10 D-Lite (Metalic) 343895
i 10 Era (Solid) 375634
i 10 Era (Metalic) 379378
i 10 Magna (Solid) 399639
i 10 Magna (Metalic) 403382
i 10 Magna-O (Solid) 475884
i 10 Magna – O (Metalic) 479626
i 10 Magna-O with Sunroof (Solid) 491600
i 10 Magna – O with Sunroof (Metalic) 495343

Prices for ANAND
Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342768
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378254
i 10 Magna (Solid) 398514
i 10 Magna (Metalic) 402258
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478502
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494218

Prices for ASANSOL
Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378254
i 10 Magna (Solid) 398514
i 10 Magna (Metalic) 402257
i 10 Magna-O (Solid) 474758
i 10 Magna – O (Metalic) 478502
i 10 Magna-O with Sunroof (Solid) 490474
i 10 Magna – O with Sunroof (Metalic) 494217

Model Price (Rs.)*
i 10 D-Lite (Solid) 343890
i 10 D-Lite (Metalic) 347689
i 10 Era (Solid) 379882
i 10 Era (Metalic) 383682
i 10 Magna (Solid) 404229
i 10 Magna (Metalic) 408027
i 10 Magna-O (Solid) 481565
i 10 Magna – O (Metalic) 485366
i 10 Magna-O with Sunroof (Solid) 497507
i 10 Magna – O with Sunroof (Metalic) 501307

Prices for BANGALORE
Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342768
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378254
i 10 Magna (Solid) 398516
i 10 Magna (Metalic) 402260
i 10 Magna-O (Solid) 474758
i 10 Magna – O (Metalic) 478501
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494218

Prices for BAREILLY
Model Price (Rs.)*
i 10 D-Lite (Solid) 340533
i 10 D-Lite (Metalic) 344294
i 10 Era (Solid) 376174
i 10 Era (Metalic) 379934
i 10 Magna (Solid) 400287
i 10 Magna (Metalic) 404048
i 10 Magna-O (Solid) 476869
i 10 Magna – O (Metalic) 480629
i 10 Magna-O with Sunroof (Solid) 492656
i 10 Magna – O with Sunroof (Metalic) 496416

Prices for BARODA
Model Price (Rs.)*
i 10 D-Lite (Solid) 345392
i 10 D-Lite (Metalic) 349135
i 10 Era (Solid) 380875
i 10 Era (Metalic) 384619
i 10 Magna (Solid) 404879
i 10 Magna (Metalic) 408622
i 10 Magna-O (Solid) 481123
i 10 Magna – O (Metalic) 484867
i 10 Magna-O with Sunroof (Solid) 496839
i 10 Magna – O with Sunroof (Metalic) 500583

Prices for BATHINDA
Model Price (Rs.)*
i 10 D-Lite (Solid) 340151
i 10 D-Lite (Metalic) 343895
i 10 Era (Solid) 375633
i 10 Era (Metalic) 379377
i 10 Magna (Solid) 399639
i 10 Magna (Metalic) 403382
i 10 Magna-O (Solid) 475883
i 10 Magna – O (Metalic) 479626
i 10 Magna-O with Sunroof (Solid) 491600
i 10 Magna – O with Sunroof (Metalic) 495343

Prices for BELGAUM
Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342771
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378254
i 10 Magna (Solid) 398518
i 10 Magna (Metalic) 402262
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478502
i 10 Magna-O with Sunroof (Solid) 490476
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for BELLARY
Model Price (Rs.)*
i 10 D-Lite (Solid) 339028
i 10 D-Lite (Metalic) 342771
i 10 Era (Solid) 374511
i 10 Era (Metalic) 378255
i 10 Magna (Solid) 398518
i 10 Magna (Metalic) 402262
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478502
i 10 Magna-O with Sunroof (Solid) 490476
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for BHILAI
Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378254
i 10 Magna (Solid) 398516
i 10 Magna (Metalic) 402260
i 10 Magna-O (Solid) 474758
i 10 Magna – O (Metalic) 478501
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for BHILWARA
Model Price (Rs.)*
i 10 D-Lite (Solid) 342265
i 10 D-Lite (Metalic) 346046
i 10 Era (Solid) 378089
i 10 Era (Metalic) 381869
i 10 Magna (Solid) 402321
i 10 Magna (Metalic) 406103
i 10 Magna-O (Solid) 479293
i 10 Magna – O (Metalic) 483073
i 10 Magna-O with Sunroof (Solid) 495161
i 10 Magna – O with Sunroof (Metalic) 498943

Prices for BHOPAL
Model Price (Rs.)*
i 10 D-Lite (Solid) 342268
i 10 D-Lite (Metalic) 346049
i 10 Era (Solid) 378091
i 10 Era (Metalic) 381873
i 10 Magna (Solid) 402323
i 10 Magna (Metalic) 406104
i 10 Magna-O (Solid) 479295
i 10 Magna – O (Metalic) 483075
i 10 Magna-O with Sunroof (Solid) 495162
i 10 Magna – O with Sunroof (Metalic) 498942

Model Price (Rs.)*
i 10 D-Lite (Solid) 349350
i 10 D-Lite (Metalic) 353210
i 10 Era (Solid) 385913
i 10 Era (Metalic) 389773
i 10 Magna (Solid) 410647
i 10 Magna (Metalic) 414510
i 10 Magna-O (Solid) 489210
i 10 Magna – O (Metalic) 493071
i 10 Magna-O with Sunroof (Solid) 505406
i 10 Magna – O with Sunroof (Metalic) 509269

Prices for BIKANER
Model Price (Rs.)*
i 10 D-Lite (Solid) 342263
i 10 D-Lite (Metalic) 346046
i 10 Era (Solid) 378088
i 10 Era (Metalic) 381867
i 10 Magna (Solid) 402320
i 10 Magna (Metalic) 406102
i 10 Magna-O (Solid) 479293
i 10 Magna – O (Metalic) 483074
i 10 Magna-O with Sunroof (Solid) 495160
i 10 Magna – O with Sunroof (Metalic) 498941

Prices for BILASPUR
Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378253
i 10 Magna (Solid) 398514
i 10 Magna (Metalic) 402256
i 10 Magna-O (Solid) 474758
i 10 Magna – O (Metalic) 478503
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for BOKARO
Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374509
i 10 Era (Metalic) 378252
i 10 Magna (Solid) 398514
i 10 Magna (Metalic) 402260
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478501
i 10 Magna-O with Sunroof (Solid) 490476
i 10 Magna – O with Sunroof (Metalic) 494219

Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342771
i 10 Era (Solid) 374509
i 10 Era (Metalic) 378252
i 10 Magna (Solid) 398514
i 10 Magna (Metalic) 402258
i 10 Magna-O (Solid) 474759
i 10 Magna – O (Metalic) 478503
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for CALCUTTA
Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378253
i 10 Magna (Solid) 398514
i 10 Magna (Metalic) 402258
i 10 Magna-O (Solid) 474758
i 10 Magna – O (Metalic) 478503
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494219

Prices for CALICUT
Model Price (Rs.)*
i 10 D-Lite (Solid) 339027
i 10 D-Lite (Metalic) 342772
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378253
i 10 Magna (Solid) 398516
i 10 Magna (Metalic) 402260
i 10 Magna-O (Solid) 474757
i 10 Magna – O (Metalic) 478501
i 10 Magna-O with Sunroof (Solid) 490476
i 10 Magna – O with Sunroof (Metalic) 494220

Model Price (Rs.)*
i 10 D-Lite (Solid) 339026
i 10 D-Lite (Metalic) 342770
i 10 Era (Solid) 374510
i 10 Era (Metalic) 378254
i 10 Magna (Solid) 398516
i 10 Magna (Metalic) 402260
i 10 Magna-O (Solid) 474758
i 10 Magna – O (Metalic) 478501
i 10 Magna-O with Sunroof (Solid) 490475
i 10 Magna – O with Sunroof (Metalic) 494218

          Future Motorcycle Designs -- Cool!!         

Engine technology and design keep advancing, and people are becoming ever more skilled at designing motor vehicles; some of these designs have already become prototypes.
Here is an assortment of future motorcycle designs:

          The Nazi Army's Mega Killer Tank, the Biggest in History!        

• In 1942, Adolf Hitler approved plans to build a 'super' tank roughly 15 times the size of an ordinary tank.

However, in 1943 the project was cancelled by Germany's Minister of Armaments, Albert Speer.

• The Nazi architects had in fact completed their plans; only the construction stage remained before the project was cancelled outright.

• The tank, named the Landkreuzer P. 1500 Monster, was to be fitted with an 800 mm calibre Krupp cannon, unlike a normal tank, which is typically armed with a gun of around 105 mm calibre such as that of the M1 Abrams.

The 800 mm Krupp cannon is the largest artillery piece ever built; each shell weighed about 7 tonnes and could be fired as far as 37 kilometres.

• The Landkreuzer P. 1500 Monster would have been about 42 metres long, weighed 1,500 tonnes and required a crew of 100.

• It would have had just enough power to reach speeds of around 10 to 15 kilometres per hour.

• The P. 1500 project was cancelled because the prototype failed to meet the speed requirements set for the tank.

This meant that a 'super' tank as large as the P 1500 would tend to move slowly, and because of its enormous size it would have been a prime target for every Allied aircraft on the battlefield.
          Hossack-style Steering on Agility Prototype        

OK, it's not an FF, but it is a prototype of a machine due to go into production some time soonish, and the Agility electric motorcycle does have a Funny Front End, i.e. not telescopic forks, so the steering is decoupled from the suspension. It's very much based on the designs of Norman Hossack. Compare and contrast with Norman's own designs, and with Peter Fouché's self-built Hossack-style steering for his FF, as described elsewhere on bikeweb. PNB
Photo: Paul Blezard at Cenex 2011, Rockingham

          SteelSeries Diablo III headset and mouse: demon slaying flair for your skull and desktop        
Sure, there are plenty of great gaming goodies to be found on the floor at E3, but some of the best stuff is much harder to find. For example, SteelSeries' new Diablo III Mouse and headset, which were revealed to us when the prototype devices were pulled from a backpack. The headset packs the same 50mm drivers, retractable boom mic, and overhead suspension design of its Siberia V2 stablemate, but adds some sinister-looking design tweaks. These cans come murdered out in matte black with lava red accents (which can be turned off) courtesy of 18 LEDs, plus matching red external speaker grilles. There's also a braided cord that plugs into your Mac or PC via USB.
We got to see the Diablo III mouse as well, and found it sports an ambidextrous design similar to the SteelSeries Xai. Its inky exterior is cloaked in smooth, soft touch plastic sporting silvery tribal accents, and a glowing ember Diablo III logo and trim around the scroll wheel. Given Diablo's click-happy gameplay, SteelSeries gave the mouse beefy Omron switches that are good for 10 million presses (backed by a five-year warranty). All those clicks are tracked by custom driver software that also lets users tailor the pointer's button layout by dragging and dropping character-specific commands to the button of your choosing. Now that your appetite has been whetted with the design descriptions, here are the details that matter most: the mouse will cost $69 and the headset $119 when SteelSeries starts selling 'em at BlizzCon in October.
          Kudo Tsunoda doesn't tell us a thing about Windows 8 support for Kinect        
Microsoft asked us to drop by at E3 to chat up a generically identified "Xbox executive," imagine our surprise when we found Kudo Tsunoda hiding behind door number six. Wearing his signature shades, Kudo gave us a brief primer on what makes Kinect great, or at least a heavy endorsement of Kinect Fun Labs. While we can certainly dig the bite-sized gadgetry Fun Labs has to offer, we like to dig deeper -- will the Kinect hardware ever be integrated into other devices? Any plans for Microsoft's 3D tracking camera and Windows 8? Kudo did his best to feed our ravenous appetite for answers.

It wasn't easy on poor Kudo, of course -- more than once we saw him glance longingly at his wrangler, a friendly PR rep keeping him from spilling the beans on anything too awesome. Probing about Windows 8 and Kinect produced one such look, and while our man very lightly suggested that Microsoft does more than games -- and that the dual-camera device might start showing up on other devices -- the big M had nothing to announce at that time. Kudo did go on to say, however, that we can expect to see new, innovative, "oh my gosh, I can't believe Kinect can do that" experiences at E3 year after year, citing this year's keynote for some recent examples. While we couldn't coax any leaks about Xbox or Kinect successors out of the man, it's good to see Microsoft's continued dedication to improving the platform.
          Nox Audio Admiral Touch prototype preview redux: now with more clever ideas and some bass        
It's been fully five months since Nox Audio's everything-but-the-kitchen-sink Admiral Touch headset prototype wowed us at CES 2011, and boy, have things changed. That ugly metal band is gone, replaced by a handsome black and silver rig, with a neatly integrated adjusting strap for a comfortable noggin squeeze. Both sides of the Admiral Touch now sport buttons, including one to add the T-Pain Effect (we kid you not). More after the break.
          Let's design the world's best beach detector! (19 replies)        
About a month ago I posted a thread here about the new French-developed Manta pulse induction technology which First Texas has acquired. Several folks made thoughtful comments about it and seem generally pleased to hear that we can expect the new beach machine in the not so very very distant future from Fisher.

It occurred to me this morning that First Texas might like to hear OUR ideas about the ideal beach detector - and hear them during the period in which they are working to turn Alexander Tartare's Manta prototype into a marketable beach detector which introduces truly new technology. Engineers will design what they think is the best that can be done with the technology available – but marketing are the people who have to go out and sell the product; in order to do that, they need a really good idea as to what will actually sell. With that information they can tell engineering what they need, and engineering can pick and choose from all the possibilities to give it to them. So right now we may have an opportunity to get a major manufacturer to actually tailor their product very directly to do the jobs that we need done.

So – given that we're talking about a pulse induction detector – one that can discriminate out iron – what would it take to make the thing perfect for your conditions, your beaches, your targets? What specific features in terms of its design, performance, user interface and mechanical package would it take to make you feel like you've got the world's best beach detector?

Right now there are a number of good to adequate machines out there – but all of them fall short, in the opinion of many on this forum, of the ideal – whether it's part of the mechanical package: battery arrangement, coil size, coil interchangeability, headphones, rod – whatever.

Performance? Are today's machines deep enough, or is more depth really needed? Is iron ID/elimination enough discrimination for salt water beach hunting? What about the controls?

Price - what would you pay for a detector which met your needs perfectly - was as robust as the CZ 21 - had the depth of the Excalibur or better - could detect targets, even those under nails, while rejecting nails, fishhooks, etc. - and was sensitive to the tiniest gold jewelry?

Since I have contacted First Texas about the Manta previously – I'm in a position to make sure they see whatever we post here in terms of what we want them to build for us!

If you're up for it, please post your ideas on this thread. I for one will not criticize even the wackiest of them!
           Florida Hospital Will Be First to Test So-Called “Smart” Prescription Bottle That May Offer a Solution to a Cost Issue in U.S. Health Care         

By George F. Indest III, J.D., M.P.A., LL.M., Board Certified by The Florida Bar in Health Law. A burgeoning health care tech company, Smrxt, recently closed shop in New York to relocate its headquarters to Orlando, Florida. It has designed a so-called “smart” prescription bottle that will alert physicians and health care companies to missed medication doses. While the bottle has so far only been used in the prototype phase, Florida Hospital will be the health care pioneer, the first to test the high-tech bottles on a more extensive scale as development continues....
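The core idea the article describes, alerting on missed doses, amounts to comparing a dosing schedule against the bottle's recorded open events. As a minimal illustrative sketch (the function names, data shapes, and one-hour grace window are assumptions, not Smrxt's actual design), the backend logic might look like this:

```python
# Illustrative sketch only: one way a "smart" bottle backend might flag
# missed doses by checking each scheduled time against recorded bottle-open
# events. The grace window and all names here are assumptions.
from datetime import datetime, timedelta

def missed_doses(schedule, open_events, grace=timedelta(hours=1)):
    """Return scheduled dose times with no bottle-open event inside the grace window."""
    missed = []
    for due in schedule:
        if not any(abs(event - due) <= grace for event in open_events):
            missed.append(due)
    return missed

schedule = [datetime(2015, 6, 1, 8), datetime(2015, 6, 1, 20)]
opens = [datetime(2015, 6, 1, 8, 20)]  # morning dose taken; evening dose never taken
assert missed_doses(schedule, opens) == [datetime(2015, 6, 1, 20)]
```

In a deployment like the one described, each flagged time would trigger the alert to the physician or health care company rather than just being returned from a function.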

          NY State Grants $622,000 For Transportation Battery And Energy Storage        

New York State Governor Andrew M. Cuomo announced $1.4 million has been awarded to six companies working on new technologies to improve the grid and battery and energy storage. The New York State Energy Research and Development Authority (NYSERDA) said this funding will help develop working prototypes that demonstrate the ability of these advanced energy […]


          Final Prototype Of NY’s Taxi Of Tomorrow Shown        

New York City’s Taxi of Tomorrow vision is about to be realized in just a few months and the vehicle is getting ready for production. Later this fall, the all-new and completely redesigned taxis are due on the city’s streets. This comes just two years after Nissan won a bid to redesign and supply the […]


          Wheel Nostalgia: Mugen M7        

Huge thanks to our friend Russell Laviolette for writing up the following bit of Mugen history! The M7 is truly a unique wheel and we love seeing the passion and time that Russell and others put into restoring and preserving these wheels.



The Mugen M7 has come to be regarded by many as one of the most important and well-designed wheels in SpoCom history. But it wasn't always so. Its popularity has increased ten-fold in recent years, and prices have followed. Initially designed for the NSX chassis, the wheel ranges in diameter from 15" to 17" and in width from 6.5" to 9". The smaller sizes suited the expanding Civic lineup as well as larger Honda platforms like the Integra and Legend.

The wheel consists of a billet aluminum face cut in the bowels of some of the earliest CNC machines and assembled using unorthodox aluminum 12pt flange bolts (M6x20). The barrels are spun aluminum and bear a specification decal similar to those seen on late MR-5s. In addition, a very unique Mugen decal is present, likely indicating a production sequence. One other bit of information is present on the rear lip of the barrel in the form of a two-digit number. Beyond this, no other markings shed light on the mysterious history of this wheel or its date of manufacture.

Some sources speculate that Enkei likely lent a strong arm in the production process, but recent research has indicated otherwise. M-Tec (Mugen) has informed me (via another source) that, similar to the MR-5, the M7 was produced by Fortran. Even so, it's possible that Fortran in turn used another manufacturer, as this kind of outsourcing is common in Japanese wheel production. Fortran would discontinue operations by the mid 90's, which may explain the short, but plentiful, production run of the M7.

In practice, the wheel sees little use in competition due to its weight, but it is a beautiful cruising wheel on nearly any Golden Era Honda chassis. I anticipate the wheel's popularity will continue to grow in the coming years and further solidify itself as one of the quintessential wheels for any Mugen collector.

*Notice the details of the prototype wheel pictured. It is missing the distinct Mugen emblem on the spoke and likely has a different finish than production versions. 

          Economic and Geo-Political Prognosis for 2015        

Paper No. 5856                                 Dated 12-Jan-2015

Guest Column by Dr. Rajesh Tembarai Krishnamachari and Srividya Kannan Ramachandran


The re-moderation of the world economy set in place over the past few years continues apace. Notwithstanding some lasting damage on the supply side through the 2008 recessionary trough, our outlook for 2015 is bullish weighing more on optimistic data trends than on continued negative sentiment proffered from some analyst quarters.

Around the world in 80 (or more) words:

Treating the ten-year US Treasury bond yield as a proxy indicator for that nation's nominal GDP growth, we anticipate United States to grow around 3% next year.[1] While this does not mark a return to the buoyant 90s, it is better than the secular stagnation hypothesized earlier in 2014.[2] With US acting as an engine to spur growth, the world economy should also expand by more than 3%.[3] Stability across the world will be maintained – as sparks without a concomitant fury will characterize both overt (e.g. Russia-West over Ukraine) and covert (e.g. China-Japan over Senkaku) animosities.[4] European stagnation from debt and unemployment will be counterbalanced through quantitative easing by the European Central Bank.[5] Similar action in Japan will display the limits of Abe-nomics.[6] China will prepare for a structural slowdown emphasizing domestic consumption and de-leveraging an over-heated financial sector; all the while growing at a 7% rate that will amaze rivals around the world.[7] Indian reform, even if inadequate, will boost the middle classes and reinforce confidence in the Modi government.[8] African countries will find their commodity boom dissipate and ease of borrowing decline as commodity prices fall and yields rise in the developed world.[9]
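The proxy in the opening sentence, reading the 10-year Treasury yield as a forecast of next-year nominal GDP growth, can be sanity-checked by measuring how far the yield "forecast" lands from realized growth. The sketch below does exactly that; all numbers in it are hypothetical placeholders, not actual market or GDP data:

```python
# Toy check of the yield-as-growth-proxy assumption used in the text.
# Inputs are hypothetical placeholder series, not real 2015 data.

def proxy_error(yields_pct, realized_growth_pct):
    """Mean absolute gap between the start-of-year 10y yield 'forecast'
    and the nominal GDP growth actually realized that year."""
    gaps = [abs(y - g) for y, g in zip(yields_pct, realized_growth_pct)]
    return sum(gaps) / len(gaps)

yields = [3.0, 2.5, 2.0]   # start-of-year 10y Treasury yields (illustrative)
growth = [3.2, 2.4, 2.5]   # realized nominal GDP growth, % (illustrative)

# Gaps are 0.2, 0.1 and 0.5 points, so the proxy is off by ~0.27 on average
assert abs(proxy_error(yields, growth) - 0.8 / 3) < 1e-9
```

A small average gap is what makes the yield a usable shorthand for the ~3% growth expectation cited above; a persistently large gap would argue against leaning on the proxy.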

Continental tectonics:

a. North America:

Economic benefits arising from the exploitation of shale gas have not only silenced the anti-fracking environmentalists, they have altered the strategic world-view of Washington politicians.[10] As US aims to overtake even Saudi Arabia in oil/NGL production in 2015 (and the Saudis pull out all stops in preventing it by driving crude prices down), it has markedly reduced its role as a global policeman.[11] Its own economy is on the mend even as a lame-duck president will be bogged down in partisan gridlock. Markets will fret about the mid-year (or earlier?) hike in interest rates; though Main Street - aided by a strong dollar - will likely shrug it off with a continued upward movement across different sectors.[12]

Mexico and Canada will benefit from their tight coupling with the United States.[13] Enrique Pena Nieto will claim credit for reforming the Mexican economy – across sectors as diverse as energy and telecom.[14] Pemex, dear to the Mexicans, will face some competition, though nothing remotely similar to the American acquisition of Tim Hortons – dear to the Canadians – will happen.[15] Up north, the Canadian elections in 2015 will reveal whether the country has reverted to its liberal propensities or sticks with Harper's conservative agenda.[16]

b. Latin and South America:

The outlook is disappointing across much of the region. Run-away inflation hammers Argentina and Venezuela; milder ill-effects bedevil Brazil, Bolivia and Uruguay.[17] The Maduro regime in Venezuela and the Kirchner government in Argentina continue to flirt with disaster as their GDP growth slips and mass discontent builds up.[18] Dilma Rousseff has stabilized her position electorally, though her policies continue to disappoint investors and have the potential to reignite sudden protests like the 2013 bus-fare protests.[19] Dependence on commodity exports in a time of declining prices does not portend well for any of the South American states, including Brazil.[20] On a positive note, Cuba – already expected by analysts to grow by close to 4% next year – will see a boost to its fortunes accruing from a thaw in relations with US under Obama.[21]

c. Africa:

African nations had a great run in the past few years. This arose not only from the boom in commodity prices but also from the need for yield amongst DM (developed market) investors, resulting in investment in both corporate and public African bonds.[22] In 2015, these factors could dissipate, which will place pressure on countries like Angola where household spending has risen more than 4000% since the start of the millennium.[23] Ethiopia and Kenya are expected to continue on a robust growth path.[24] Contradictions abound within Africa, and nowhere are they more visible than in Nigeria. While the northern part struggles under the oppression of Boko Haram, the southern part booms under Goodluck Jonathan's presidency.[25] In neighboring South Sudan, one is reminded of the risk-reward payoff, as the nation widely tipped to experience spectacular growth in 2014 got mired in conflict, with the consequent dissipation of growth potential.[26]

American intervention in Libya undermined the Gaddafi-imposed order and has led to a civil war between the Islamist and secularist factions which will hold back that nation in the coming year.[27] A more benign intervention was that of the French in Mali in 2013; we expect more calls for Hollande's assistance in 2015.[28] El Sisi has stabilized Egypt after the Muslim Brotherhood interlude in the post-Mubarak era. Though more brutal than Mubarak, the El Sisi regime is being propped by both the Americans and Saudis, leading us to expect the recent bull run in Egyptian markets to continue.[29] ANC rule in South Africa continues unimpeded. Though atrophied by many scandals, the rule should produce close to 3% growth in the coming year.[30]

d. Middle East:

The region continues to be a cesspool of ethno-sectarian rivalries as the century-old Sykes-Picot agreement unravels.[31] Recep Erdogan has stabilized Turkey and should reap growth on par with other emerging economies.[32] Erdogan's external actions, driven by AKP's crypto-desire to establish a caliphate, will see him prop up the Islamic State (IS) just enough that it can damage Shia and Kurdish interests, but not enough to threaten his own Sunni hegemonic plans.[33] The Saudi establishment has focused on the removal of the Muslim Brotherhood threat; now they will focus on limiting Shia Iranian influence by keeping crude prices low.[34] Western companies made a beeline to Iran in 2014 in hope of an impending thaw; much will depend on the negotiating ability of the Rouhani establishment on the sanctions front.[35] Dubai and Israel remain insulated from the turmoil around them and could reap the benefit of the uptick in the world economy.[36] The risk of sudden flare-ups like the 2014 Gaza war continues to remain on the Israeli radar.

e. Asia and Australia:

The Asian political scene is remarkably stable with China, Japan and India looking inward to stabilize their economies under the leadership of Xi Jinping, Shinzo Abe and Narendra Modi, respectively. Some events have gone unnoticed by world media: for example, China starts the year of the goat as the world's largest economy when measured in PPP terms, and, for the first time ever, Chinese outbound investments could exceed inbound ones.[37] China's establishment on the world stage has made Xi stronger than any Chinese leader in recent memory bar Chairman Mao himself. The Abe regime will continue on its reformist route of bringing Japan out of the deflationary zone, while winking at nationalist sentiment calling for a re-interpretation of the country's post-war pacifist role.[38] Down south in India, Modi has surprised supporters and detractors alike with his middle-path approach to reforming the economy and his zealous interest in foreign policy. While reforming cautiously, he has not removed the populist schemes of the previous government. 2015, with few local elections (other than in Bihar) to impede him, will prove a litmus test of his claims of good governance.[39]

Afghanistan under Ashraf Ghani will face more trouble from the Taliban as the US adopts the Pakistani classification of good versus bad Taliban.[40] In nearby Pakistan, the wildly popular Imran Khan, perhaps with some help from the deep state, will challenge the established parties on their home turf.[41] In Indonesia, Joko Widodo has come to power with Imran Khan-style support amongst the youth, and he will be hard-pressed to implement his reformist agenda, including reducing fuel subsidies, amidst persistent opposition from entrenched interests.[42] ASEAN will continue to slip on its stated intentions for closer cooperation.[43] Australia will try to balance its strategic partnership with the United States against economic dalliances with the Chinese.[44]

f. Europe and Russia:

Vladimir Putin will be emboldened by the short-term rise in his domestic popularity, and hence ignore the longer-term implications of his intervention in Ukraine.[45] Tighter coupling with Kazakhstan and Belarus will not prevent what is likely to be a low-growth, high-inflation year for the Russians.[46] Europe as a whole continues to underperform, most visibly in France and Italy, both of which might record less than 1% growth in GDP. With the Trierweiler-Gayet saga behind him, Francois Hollande will attempt to rein in a deficit running at close to 4% of GDP. Even with help from the ECB's quantitative easing program, there is little expectation that Hollande can avoid being the most unpopular leader amongst all Western democracies.[47] In Italy, high debt and unemployment - exemplified by the statistic that four-fifths of Italians between the ages of 20 and 31 live with their parents - will hamper any efforts Matteo Renzi might make to pull the economy out of its doldrums.[48]

The Greeks might look forward to a better year, especially when juxtaposed against their recent past. On the back of painful reforms, the Greek economy is widely anticipated to commence its long journey back to health, though there might be recurrent political scares and persistent rumors of a Greek exit.[49] The German government will be buffeted by opposing demands: external calls for a more interventionist role in stabilizing the world economy and internal ones for tempering the same. Cautious progress on the fiscal front will lead to modest GDP growth.[50] Ironically, the European nations with the best GDP growth projections are also the ones with the highest exposure to Putin's misadventures, viz. Poland, Latvia and Lithuania.[51]

Sectors and segments:

Oil prices, having dropped significantly in the past few months, shape the prospects for many industry sectors in 2015. Oil is typically expected to revert to the mean, because a lower oil price has a discernible impact on both the supply side (by discouraging investment in production and distribution) and the demand side (by boosting economic activity).[52] The speed of such mean-reversion remains unclear. Russia, Iran and US shale producers (especially those not based at strategic locations) suffer disproportionately more than the Saudi establishment at current price levels.[53] Lower oil prices will provide a fillip to consumer discretionary industries and airlines, and will have an adverse impact on railroads (which have benefited from oil transportation) and petrochemical companies. The shale gas boom - apart from increasing housing activity - is also the prime driver behind growth in the US steel and construction-material sectors; consequently both will remain susceptible to crude movements.[54]

Low interest rates and low macro-growth prospects will induce companies with excess cash to acquire other companies in order to report earnings growth. That trend will be apparent in sectors as diverse as healthcare, industrials, semiconductors, software and materials.[55] Elsewhere in investment banking, trading desks will see higher market volatility as major powers pursue divergent monetary policies (e.g. the US against the EU and Japan).[56] In the US, regulatory obligations that increase the cost of capital for holding certain securities might lead to decreased broker liquidity.[57] 2015 will see the big banks grapple with Basel III and the Volcker Rule; one expects the regulatory push towards vanilla deposit-taking and lending to continue.[58] Analysts will hope that stronger balance sheets coupled with a return to profitability lead to increased dividend payouts for investors in financial stocks. China will seek to tame its overheated financial sector amidst a structural slowdown,[59] and India will see RBI governor Raghuram Rajan continue his battle against political interference in corporate lending.[60] Wealth management services will perform remarkably well not only in China but also, to a lesser extent, in the US, as a rising market creates wealth and a retiring baby-boomer crowd seeks to couple low risk with acceptable returns.[61] In mobile payments, Apple Pay will try to avoid the lackluster performance of earlier attempts like Google Wallet.[62]

Lower gasoline prices and an accompanying increase in disposable income (through wealth creation in the markets, increased home values, reduced unemployment and improved economic activity) create a positive outlook for the consumer discretionary sector. Companies dealing in organic farming benefit from increased health consciousness; the market for yoga will continue to rise after 2014 saw the UN declare a world yoga day on Modi's initiative.[63] Even as DVDs and Blu-rays decline, digital film subscriptions and on-demand internet streaming will rise to please Hollywood.[64] Bollywood will get over its obsession with INR 100 crore revenues as movies cross that level more frequently.[65] With hotel supply little changed from a few years back, revenue per room will rise across the sector.[66] Tighter access to credit continues to hamper the rise in existing-house sales, which nevertheless should improve over the past year.[67] Asian apparel manufacturers continue to improve their market share in the fast-fashion market.[68] October 2015 will see Europeans benefit from the eCall service in all their new cars, which lets a car immediately report the details of any accident. New carbon-emission standards also come into force in Europe; elsewhere too, the move towards higher efficiency in cars will continue.[69] Widodo will be pleased at the growth in automobile sales in Indonesia, which should exceed that of other major markets.[70] Internet advertising is rising faster than television commercials, though 2015 will still see the latter dominate in overall revenue generated.[71] Privacy continues to erode on the social media front.[72] The newspaper industry will see an increased number of advertorials re-packaged as "native advertising", whereby companies pay for advertisements written in the style of newspaper articles.[73]

In India, the BJP government is yet to clarify its position on foreign direct investment in retail.[74] Irrespective of its final decision, retail sales there should surge sharply as the release of several years of pent-up demand couples with a thriving 'mall culture' in middle-tier cities. China will also see an increase in retail sales in spite of its investigation into Walmart.[75] The anti-corruption campaign, though, will negatively impact sales of luxury goods as well as higher-end automobiles there.[76] A strong dollar will affect US companies with significant operations abroad. Wheat production might match 2014's record volumes in Europe,[77] though more newsprint will probably be devoted to higher prices of cocoa from Ivory Coast.[78] The idiosyncrasies of local markets will shine as Dubai invests in large-scale brick-and-mortar malls while Manhattan gets more of its groceries delivered to the doorstep.[79]

Demand for energy should rise at the same pace as world GDP next year. Analysts will point at attractive valuations of oil companies.[80] If shale gas prices remain attractive, Sabine Pass in Louisiana will emerge as the first plant in the US to export LNG.[81] Four years after the Fukushima incident, Japan will see nuclear reactors back in operation at Sendai.[82]

2014 saw the denizens of the developed world fret about Ebola, breast cancer (through a campaign by actor Angelina Jolie) and ALS (through the ice bucket challenge).[83] Overall, health spending will comfortably outpace the growth of the overall economy. The long-term secular trends driving this are the aging population in the Western world (with the population pyramid replaced by a population dome) and an emerging middle class elsewhere demanding improved access to healthcare.[84] Universal healthcare has been promised for all in India, which should drive up healthcare expenditure there by a significant amount.[85] In 2015, large US companies are mandated under Obamacare to provide insurance to more than 70% of their eligible workforce.[86] Uncertainty over US healthcare reform, and the debate thereon, may cause short-term price volatility. The Millennium Development Goals will be reviewed by the UN later in the year, with a new set of goalposts announced for countries to meet by 2030; different NGOs will campaign vigorously through the media to get their pet agendas included in the final list.[87]

Transportation companies will report higher earnings from increased economic activity.[88] Apart from some airlines that have suffered reputation damage through recurring accidents, airline companies will benefit from reduced oil prices. The defense industry will see robust growth in China, as "Chi-America" proves to be no mere chimera.[89] Alarmed by this increase, Vietnam and the Philippines will move within the US ambit, and Australia will seek to join the tripartite naval exercises in the Indian Ocean between the US, Japan and India.[90] Tensions in Eastern Europe and the Middle East will favor increases in expenditure across those regions. The nationalist government in India will increase defense expenditure sharply even as it moves beyond lip service on the long-standing issue of indigenizing defense manufacturing.[91]

The mantra of social-local-mobile ("SoLoMo" in tech jargon) continues to drive the consumer-markets divisions of information technology companies.[92] Expenditure on IT hardware is significantly slowed by the increasing move to cloud computing.[93] That move to cloud computing - along with the increasing use of mobile commerce - bodes well for the computer security business.[94] India should see a sharp increase in smartphone adoption; elsewhere tablet computers will rise against laptops and desktops.[95] Embedded systems coupled with rudimentary networking will be marketed as an all-encompassing "internet of things" as the era of big data continues.[96] Today, a single family in the US places more demand on data flow than the entire planet did a decade back, and even this data rate is expected to increase by a whopping 70% over the next year. Consolidation in the cable sector (e.g. Comcast with Time Warner Cable) and the convergence of content with distribution (e.g. AT&T with DirecTV) are two trends that should continue on from 2014.[97] Even as Indians talk about 3G coverage spanning the nation, Americans will tweet about 4G price warfare and the Chinese will see ZTE unveil a 5G prototype.[98] Facebook will have more users than China has human beings.[99] Analysts will harp on the impact of interest-rate hikes on high-dividend-paying telecom stocks.[100] After the financial industry, telecom will emerge as the industry most impacted by federal regulation across the globe.

The anthropologist Edward Weyer once compared the future to a "corridor into which we can see only through the light coming from behind". It is in that sense that we have analyzed the data of the bygone year and tried to extrapolate into the days and months ahead. And when some predictions are falsified - and falsified, some will be - we shall lay the credit for that at the feet of those responsible, viz. us, the people.

[The authors are based in New York City, and can be contacted through email at and The views represented above are personal and do not in any manner reflect those of the institutions affiliated with the authors.]


[1] See the graph titled "10 year bond yield: annual change and real GDP: annual % change" at

[2] "Secular stagnation: facts, causes and cures", a VoxEU eBook at

[4] A brief historical perspective on the Russia-Ukraine conflict is at

The Economist magazine summarizes the debate over Senkaku islands at

[5] “The ECB, demigods and eurozone quantitative easing” at

[6] “Bank of Japan announces more quantitative easing: the next chapter in Abenomics” at

[7] “World Bank urges China to cut economic growth target to seven percent in 2015, focus on reforms” at

[8] “Reforms by PM Narendra Modi will help India to grow 5.5% this year, 6.3% next year: ADB” at

[10] “The experts: how the US oil boom will change the markets and geopolitics”,

[13] “Economic growth patterns in USA, Canada, Mexico and China” at

[14] “Mexican president Pena Nieto's ratings slip with economic reform” at

[17] “Andres Oppenheimer: Latin America's forecast for 2015: not good” at

[18] “Maduro blames plunging oil prices on US war vs Russia, Venezuela” at and “What's in store for post-Kirchner Argentina” at

[19] “Brazil economists cut 2015 growth forecast to slowest on record” at

[20] “Economic snapshot for Latin America” at

[21] “Cuba, Dominican Republic and Puerto Rico business forecast report Q1 2015” at and “Obama's Cuba move is Florida's top story for 2014” at

[24] “Ethiopia overview” at and “Kenya overview” at

[26] “Internal violence in South Sudan” at!/?marker=33.

[27] “Political instability in Libya” at!/?marker=14.

[28] “The regional impact of the armed conflict and French intervention in Mali” at

[29] “EGX head optimistic on equities as Egyptian economy recovers” at

[30] “Economy - outlook for 2015 dismal, despite boost” at

[31] “Pre-state Israel: The Sykes-Picot agreement” at

[32] “Turkey - economic forecast summary (Nov 2014)” at

[34] “Saudi-Iranian relations since the fall of Saddam” at

[36] “Dubai 2015 cross sector business outlook extremely bullish” at and “Israel - economic forecast summary (Nov 2014)” at

[37] “China's leap forward: overtaking the US as world's biggest economy” at

[38] “Understanding Shinzo Abe and Japanese nationalism” at

[39] Book: “Getting India back on track: an action agenda for reform” edited by B. Debroy, A. J. Tellis and R. Trevor.

[40] “US may not target Mullah Omar after this year" at

[41] “The rise and rise of Kaptaan” at

[42] “Widodo launches reform agenda with fuel price hike” at

[43] “ASEAN's elusive integration” at

[46] “Russia's economics ministry downgrades 2015 oil price forecast to $80 per barrel” at

[47] “Hollande popularity plumbs new low in mid-term French poll” at

          How to prototype and influence people        

          What are Mattel and Google doing with View-Master?        

View-Master Celebrates 65 Years of 3-D Magic

With a View-Master-topped teaser (which you can see after the break), Google and Mattel invoked one of our favorite childhood memories -- and a frequent inspiration for low-budget virtual reality shenanigans. The two are planning an "exclusive announcement and product debut" ahead of the New York Toy Fair next week, but other than the View-Master theme there's little to go on. Mattel's Fisher-Price division tried a View-Master comeback for the digital age in 2012, although all trace of it is gone now. We'll have to wait until next Friday to see for ourselves what they're planning, but we invite your wildest speculation until then. So what are you thinking -- a plastic pair of branded Mattel VR goggles based on the Cardboard project, or maybe a Hot Wheels car based on something else Google has been working on?

          Being a Gamer without Gaming        

Gaming's a large part of my life, but I haven't had the time to play these days. I'm a young guy about to jump into the working world. What used to be a life of college parties peppered with late-night pizza-and-play marathons is being replaced by adult life's responsibilities. I can't complain, but I do miss the high school days where I could stay on the couch, play some Halo (ok, maybe a lot of Halo), and not have to worry about rent.

When the family, the job, and the daily grind prevent you from playing many games, what's a gamer to do?

Plenty! If you play the occasional multiplayer game a few times a week but don't understand how anyone finds time to sink 1,000 hours into the Witcher 3, this blog's for you. If you want to feel like a gamer without playing a single match of CoD, read on.


Thank god for podcasts. I listen to them more often than music these days. Ben Hanson's sultry voice in particular just gets me going in ways Beyonce can't. In all seriousness, podcasts are great for mindless activities. While music might allow your mind to wander, podcasts can still teach you all about the gaming industry while you walk to work or wash the dishes (or in the most recent GI podcast's case, embalming bodies). Podcasts like GI's allow you to get weekly reactions to industry happenings. Similar to following gaming news, an informed gamer is a gamer who knows how to maximize their time and budget for the most fun. I regularly listen to GameInformer Podcast, EasyAllies, and FrameTrap, but IGN and most other major game news outlets have their own podcasts these days. Find one that suits you and stick to it!


If you don't have time to crush newbs in your favorite online game, you can at least fantasize about it by listening to your favorite gaming soundtracks. Music and videogames have had a close relationship since the 8-bit days. Tunes that were once confined to a nerd's basement are now played by full orchestras to live audiences. Music is an essential part of any game's atmosphere. Listening to a theme on the train can rip your brain from reality and toss you into the game. If you don't have time to sit in front of a console, find yourself some earbuds and jam to your favorite videogame music. Spotify, YouTube Music, and Pandora all have videogame libraries available.

Make Stuff

Yes, please do! Even if you don't have the time and energy to play a game, you may have time to doodle a video game character between work breaks. You could write a blog, hash out a couple of game ideas, or even write up a narrative. If you know some code, make a mechanic. If you know how to use Illustrator, make some pixel art gifs. Creativity comes in all shapes and sizes, figures and forms. You never know when inspiration will hit like a ton of bricks, but acting on that inspiration feels very rewarding when you don't have time to wrap your grubby hands around a controller. Sometimes, you may feel so productive that playing a game feels like a waste of time. If this is you, join a local Game Jam! Jams around the world draw anywhere from tens to thousands of participants who spend a weekend making a prototype game. It's fun, it's social, and it exercises important hard and soft skills. Search for game jams near you if this sounds like a dream.

Follow the News

Being an informed gamer means you won't waste your money and time on bad games or consoles; someone else has already made that mistake for you. If you're here, you're probably a GameInformer addict of some stripe and color. But don't follow just one news source! Every news source has biases and misses good stories. Kotaku, Gamespot and IGN are larger news sources to feed your infolust. Click here to see a super-list of 100 'essential' gaming websites to follow. Go one step further by looking at local news sources from journals or blogs near you. Searching "Gaming News in Your Hometown" should reveal local news sections or even an entire website/blog dedicated to gaming news.

Let's Plays & Live Streams

Yes, if you can't play the game, why not watch someone else play it? I was a short-term bandwagon fan of PewDiePie and Markiplier back when I still thought these guys were naturally funny and not faking reactions for money. Still, Let's Plays and Live Streams are some of the best ways to see how a game plays without actually playing it. YouTube, Facebook, and Twitch are all large streaming platforms that gamers fuel with rage memes and Doritos. I've played Rocket League off and on for the last year or so, and I gotta say, following Gibbs and watching Live Streams of pro tournaments has actually improved my skill somewhat. Since I whiff the ball a lot less, I enjoy the game a lot more.

So yes, there are ways to be a gamer without playing games. Sometimes, life just gets too real and we don't have time to invest in games. The backlog builds and builds. Until the day comes when we dust off the backlog and burn through it, the activities I mention above are great ways to stay involved in the hobby we all love. When you don't have time to play games, how do you stay connected?

Cheers everyone, and Happy Gaming!

          Man Reportedly Finds an SNES-CD Prototype Console [Video]        
Finding pirate treasure is something that you will probably never be able to do... Finding a prototype of the ill-fated collaboration between Sony and Nintendo to create a disc peripheral for the SNES, dubbed the Nintendo PlayStation? Now possible.
          Making a Cheap Radar Unit Awesome        

[JBeale] squeezed every last drop of performance from a $5 Doppler radar module, and the secrets of that success are half hardware, half firmware, and all hack.

On the hardware side, the first prototype radar horn was made out of cardboard with aluminum foil taped around it. With the concept proven, [JBeale] made a second horn out of thin copper-clad sheets, but reports that the performance is just about the same. The other hardware hack was simply to tack a wire onto the radar module's analog output and add a simple op-amp gain stage, which extended the sensing range considerably.
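The firmware half of such a hack comes down to turning the module's amplified audio-rate output into a speed reading. Here is a minimal, hypothetical sketch (not [JBeale]'s actual code): it estimates the dominant Doppler frequency by counting zero crossings, then converts to speed; the 10.525 GHz transmit frequency is an assumption typical of cheap HB100-style modules.

```python
import math

def doppler_speed(samples, sample_rate_hz, tx_hz=10.525e9):
    """Estimate target speed (m/s) from a Doppler module's amplified output.

    Approximates the dominant Doppler frequency by counting zero crossings
    of the mean-removed signal, then applies v = f_d * c / (2 * f_tx).
    The 10.525 GHz default transmit frequency is an assumption.
    """
    mean = sum(samples) / len(samples)
    centered = [s - mean for s in samples]
    # A sign change between consecutive samples marks one zero crossing.
    crossings = sum(1 for a, b in zip(centered, centered[1:])
                    if (a < 0) != (b < 0))
    duration_s = (len(samples) - 1) / sample_rate_hz
    f_doppler = crossings / (2 * duration_s)  # two crossings per cycle
    c = 299_792_458.0                          # speed of light, m/s
    return f_doppler * c / (2 * tx_hz)

# Example: a synthetic ~702 Hz tone corresponds to roughly 10 m/s at X-band.
tone = [math.sin(2 * math.pi * 702 * n / 8000) for n in range(8000)]
print(round(doppler_speed(tone, 8000), 1))
```

Zero-crossing counting is crude next to an FFT, but it runs comfortably on the small microcontrollers these modules usually get paired with.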

          Que Du Web 2017 - Architecture / Prototypes / Tech: synthesizing UX complexity        

Architecture / Prototypes / Tech: how to synthesize UX complexity for your clients. Over the years I have found it increasingly complicated to communicate the sheer mass of documents that make up a web project. Site map, prototypes, feature lists, personas, research results, workflows, wireflows, etc. The list of reference documents that accompany a project is often long and opaque to clients. For several years now, we have been testing a delivery methodology that lets clients visualize, as a single whole, the site architecture, the prototypes, and the features attached to particular pages. Depending on the client, you can also work through the wireflows with them directly and attach the personas to them. This method (which requires a lot of post-its and a large wall) gives clients a more complete view of the work you are doing. It also involves them directly in building the product and speeds up certain validation steps. We continue to improve this method, drawing in particular on lean practices.
          Notes for WikiCite 2017: Wikispecies reference parsing        

In preparation for WikiCite 2017 I'm looking more closely at extracting bibliographic information from Wikispecies. The WikiCite project "is a proposal to build a bibliographic database in Wikidata to serve all Wikimedia projects". One reason for doing this is so that each factual statement in WikiData can be linked to evidence for that statement. Practical efforts towards this goal include tools to add details of articles from CrossRef and PubMed straight into Wikidata, and tools to extract citations from Wikipedia (as these are likely to be sources of evidence for statements made in Wikipedia articles).

Wikispecies occupies a rather isolated spot in the Wikipedia landscape. Unlike the other sites, which are essentially comprehensive encyclopedias in different languages, Wikispecies focusses on one domain - taxonomy. In a sense, it's a prototype of Wikidata in that it provides basic facts (who described what species when, and what is the classification of those species) that in principle can be reused by any of the other wikis. However, in practice this doesn't seem to have happened much.

What Wikispecies has become, however, is a crowd-sourced database of the taxonomic literature. For someone like me, who is desperately gathering up bibliographic data so that I can extract articles from the Biodiversity Heritage Library (BHL), this is a potential goldmine. But there's a catch. Unlike, say, the English-language Wikipedia, which has a single widely-used template for describing a publication, Wikispecies has its own method of representing articles. It uses a somewhat confusing mix of templates for author names, and then uses barely standardised formatting rules to mark out the parts of a publication (such as journal, volume, issue, etc.). Instead of a single template to describe a publication, in Wikispecies a publication may itself be described by a unique template. This has some advantages, in that the same reference can be transcluded into multiple articles (in other words, you enter the bibliographic details once). But it leaves us with many individual templates with multiple, idiosyncratic styles of representing bibliographic data. Some have tried to get the Wikispecies community to adopt the same template as Wikipedia (see e.g. this discussion), but the proposal has met with a lot of resistance. From my perspective as a potential consumer of the data, the current situation in Wikispecies is frustrating, but the reality is that the people who create the content get to decide how they structure that content. And understandably, they are less than impressed by requests that might help others (such as data miners) at the expense of making their own work more difficult.

In summary, if I want to make use of Wikispecies I am going to need to develop a set of parsers that can make a reasonable fist of parsing all the myriad citation formats used in Wikispecies (my first attempts are on GitHub). I'm looking at parsing the references and converting them to a more standard format in JSON (I've made some notes on various bibliographic formats in JSON, such as BibJSON and CSL-JSON). One outcome of this work will be, I hope, more articles discovered in BHL (and hence added to BioStor), and more links to identifiers, which could be fed back into Wikispecies. I also want to explore linking the authors of these papers to identifiers, as already sketched out in The Biodiversity Heritage Library meets Wikidata via Wikispecies: adding author identifiers to BioStor.
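To give a concrete flavor of what one such parser looks like, here is a tiny sketch of the kind of thing I have in mind. The citation pattern and field mapping below are simplifications I invented for one common "Author. Year: Title. Journal volume (issue): pages" style; the real Wikispecies corpus needs many such patterns.

```python
import re

# Matches one simplified reference style, e.g.
# "Smith, J. 1897: On some new beetles. Annals of Natural History 12 (3): 45-67."
# This pattern is an illustrative assumption, not a Wikispecies standard.
REF = re.compile(
    r"(?P<author>[^.]+\.)\s+"        # crude: author runs up to the first period
    r"(?P<year>\d{4}):\s+"
    r"(?P<title>.+?)\.\s+"
    r"(?P<journal>.+?)\s+"
    r"(?P<volume>\d+)\s*"
    r"(?:\((?P<issue>\d+)\))?:\s*"
    r"(?P<pages>\d+[-–]\d+)"
)

def parse_reference(text):
    """Return a CSL-JSON-like dict for one reference, or None if unparsed."""
    m = REF.search(text)
    if not m:
        return None
    d = m.groupdict()
    return {
        "type": "article-journal",
        "author": [{"literal": d["author"].rstrip(".")}],
        "issued": {"date-parts": [[int(d["year"])]]},
        "title": d["title"],
        "container-title": d["journal"],
        "volume": d["volume"],
        "issue": d["issue"],
        "page": d["pages"].replace("–", "-"),
    }
```

A production version would chain many such patterns, falling through to the next one on failure, and record which pattern matched so that systematic parse errors can be audited.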

          Apple reportedly working on several prototype AR glasses        
Apple's augmented reality plans will probably not stop at ARKit in iOS 11. The company is reportedly testing several kinds of AR headsets to see which works best.
          Swedish FCHouse impresses IKEA        

Hans-Olof Nilsson, a retired Swedish engineer from the refrigeration industry, decided to go off-grid with his house in a serious way, storing the summer sun as hydrogen to keep warm in the cold Swedish winter. IKEA has shown interest. This is his recipe. You take: An alkaline electrolyzer, delivered by GreenHydrogen in Denmark as a prototype, producing 2 Nm³ of hydrogen per hour. It takes 5.5 kWh to produce and store 1 Nm³ of hydrogen with a caloric energy content of 3.3 kWh. To produce 1 cubic meter of hydrogen takes 1 liter of purified and de-ionized water. From that amount of hydrogen, 1.5 kWh of electricity and 1.5 kWh of heat are generated in the fuel cell. The fuel cell's heat is integrated into the general heating system of the house. A compressor, using 0.5 kWh of the 5.5 kWh needed to produce and store 1 Nm³ of hydrogen, to compress it to 300 bars. The new and more efficient metal hydride compressor system from a Norwegian supplier has no moving parts and works by temperature differentiation. Annual production from the electrolyzer is roughly 3,000 Nm³ of hydrogen. 2,000-2,200 Nm³ of hydrogen will be used by the fuel cell for room warming, hot water, and household electricity needs such as ventilation, washing, cooking and lighting. Charging of electric cars is of course included. There is an estimated surplus of 800-1,000 Nm³ that Hans-Olof plans to use for a Toyota Mirai (hydrogen fuel-cell electric car), which will run approximately 10,000 kilometers on that amount. Oxygen from the electrolysis process, half the amount of hydrogen, is vented to the outside air. The hydrogen fuel cell is a working prototype by Swedish fuel cell maker PowerCell. It was developed specifically for this house as a mutually beneficial project, allowing Hans-Olof to produce the needed heat and electricity in wintertime while generating a massive amount of data and experience for PowerCell. An internet connection provides monitoring and remote control of the unit.
It delivers roughly 1.5 kW of electrical power and 1.5 kW of heating effect. Two grey tubes on the bottom left side of the unit take in cooling water and return 65-70 °C hot water. Red and blue cables at the top left deliver 48 VDC to the earlier illustrated inverter system. An electrical output of 1.5 kW may not seem impressive, yet when it runs 24/7 to charge the batteries there is always enough energy to meet peak demands such as car charging or the heating needs of the house, in fact all functions of the house. A new fuel cell is to be delivered by the same manufacturer and is now a standard product named PS-5. The "5" refers to kW, meaning it will produce 5 kW of electrical power and 5 kW of thermal power.
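The quoted figures make the round-trip efficiency of the storage chain easy to sanity-check. A quick back-of-the-envelope sketch using only the numbers above (the constant names are mine):

```python
# Figures quoted above for the Nilsson house, per Nm³ of hydrogen.
KWH_IN_PER_NM3 = 5.5         # electricity to produce, compress and store
KWH_ELEC_OUT_PER_NM3 = 1.5   # electricity recovered in the fuel cell
KWH_HEAT_OUT_PER_NM3 = 1.5   # usable heat recovered in the fuel cell

def round_trip_efficiency(include_heat=True):
    """Fraction of input electricity recovered as useful energy."""
    out = KWH_ELEC_OUT_PER_NM3
    if include_heat:
        out += KWH_HEAT_OUT_PER_NM3
    return out / KWH_IN_PER_NM3

def annual_recovery(nm3_used):
    """(electricity, heat) in kWh recovered from nm3_used of stored hydrogen."""
    return nm3_used * KWH_ELEC_OUT_PER_NM3, nm3_used * KWH_HEAT_OUT_PER_NM3

print(round(round_trip_efficiency(include_heat=False), 2))  # ~0.27 electric-only
print(round(round_trip_efficiency(), 2))                    # ~0.55 with heat credited
```

At the stated 2,000-2,200 Nm³ a year through the fuel cell, that works out to roughly 3,000-3,300 kWh each of electricity and heat, which is consistent with a modest but fully off-grid household when the fuel cell runs continuously.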
          “Silverlight on the Silver Screen”: Two Days Away!        
Don’t forget that Silverlight on the Silver Screen, ObjectSharp’s free seminar on Silverlight 3, Expression and SketchFlow takes place in Toronto this Thursday at the Scotiabank Theatre. If you’d like to learn more about the rich-UI applications that you can build with Silverlight 3 and Expression and how quickly you can design and prototype user […]
          Mastering the TurboGears EasyCrudRestController        
One of the key features of TurboGears2 is the great CRUD extension. Mastering the CRUD extension can really make the difference between spending hours and spending just a few minutes on writing a web app prototype or even a full application. The CRUD extension provides two main features: the CrudRestController, which is meant to help create fully custom CRUDs, and the EasyCrudRestController, which provides a quick and easy way to create CRUD interfaces. I'll focus on the EasyCrudRestController as it is the easier and more productive of the two; moving on to the CrudRestController is quite straightforward after you feel confident with the Easy one. The target will be to create, in no more than 40 lines of controller code, a full-featured […]
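To give a flavor of the declarative style involved, here is a self-contained toy sketch that mimics the pattern (declare a model, inherit, and get CRUD methods for free). It deliberately avoids importing TurboGears, so the class and method names below are illustrative stand-ins rather than the real tgext.crud API.

```python
# Illustrative only: a dict-backed controller mimicking the
# "subclass and declare a model" pattern of an EasyCrudRestController.
class ToyCrudController:
    model = None  # subclasses declare what they manage

    def __init__(self):
        self._rows, self._next_id = {}, 1

    def get_all(self):
        """List every stored row."""
        return list(self._rows.values())

    def get_one(self, item_id):
        """Fetch a single row by id, or None."""
        return self._rows.get(item_id)

    def post(self, **fields):
        """Create a row, assigning it a fresh id."""
        row = {"id": self._next_id, **fields}
        self._rows[row["id"]] = row
        self._next_id += 1
        return row

    def put(self, item_id, **fields):
        """Update an existing row in place."""
        self._rows[item_id].update(fields)
        return self._rows[item_id]

    def post_delete(self, item_id):
        """Remove a row if it exists."""
        self._rows.pop(item_id, None)

# The entire "CRUD interface" for movies is just this declaration:
class MovieController(ToyCrudController):
    model = "movie"
```

In real TurboGears code the subclass body is similarly small; the framework derives the forms, tables and REST dispatch from the SQLAlchemy model for you, which is exactly what makes the 40-line target plausible.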
           Edinburgh sets up pop-up village for the homeless         
A prototype of the wooden houses which will shelter Scotland's homeless in a pop-up village has been unveiled today in St Andrews Square in Edinburgh. It has two bedrooms and a kitchen.
          Phaser Coding Tips 1        
Phaser Coding Tips is a free weekly email – subscribe here. Welcome! If you’re anything like me you probably write loads of code, from prototypes demonstrating a single mechanic to helper functions. And sometimes you might even finish a game. I started this series as a means to share code with you in an informal […]


So have you ever experienced one of those heebie-jeebie feelings?  You know, the one where the twilight zone plays in the background because something really coincidentally weird happened?  Something that you just can't explain?

Well I decided to take a diversion from sports in this blog and tell you about a crazy thing that happened to me on a recent trip to FLA for a business meeting.  It happened in the Orlando airport on my way back to Providence.  It is a small but randomly pointed coincidence between myself and a stranger that inspired me to write....

As I drove from Daytona to Orlando, I was about a half hour away from the airport when my trusty text alert from Southwest Airlines went off...I get these unnamed texts from time to time, usually Unicef or Walmart --how they get my cell I'll never know--but I almost didn't read it. I glanced down at the buzz to see that my 9:30 pm flight to Providence was now changed to 10:40 pm--so I kicked my own butt because I was already thinking "hey JA--serves you right for booking such a late flight to begin with...what were you thinking?". So obviously I was in no hurry to get to my gate as I dropped my rental, meandered to the ticketing counter, struck up a conversation with the woman who ticketed me, who ended up hailing from Burrillville, RI (of course she did!), and finally made my way into the long snakelike line that ends at the X-RAY body scanner.  After being tagged as a liquid contraband smuggler by the X-Ray machine attendant--it turned out that my yogurt was the liquid in question--I grabbed my laptop, stuffed it into my bag and proceeded to just miss the next tram to the terminal.  It must've taken me too long to dress after the strip search!!

For whatever reason the terminal wasn't crowded, so very few people were getting on the tram.  The next tram that came dumped off a horde of people and the few lone LATE night travelers entered.   I headed to the first car of the tram.  I was the only one to enter. The rest of the embarkers got on the last car, which was the closest to the security area--I had chosen to burn the calories and walk the extra 10 yards to the front!!   As I enter the tram all by my lonesome, at the front of the tram in the big window a Swiss Army black laptop case sits solo with no owner in sight.  For a split second I can hear in my head the voice of the airport police saying "never leave your luggage unattended...". After all, aren't we conditioned since 9/11 to think everyone is a terrorist?  I half think that the second I pick up the bag it will self-detonate like one in an NCIS Los Angeles episode and I will go out of this life in a ball of fire!  Then the realist in me sees in my mind's eye a mom and dad anxious to get to Disney with the kids.  As the doors fly open one of the kids darts out, and they are so flustered chasing the kids out the door that they leave the laptop bag on the front seat of the tram--I actually think that happened to me once!!

So the Good Samaritan with lots of time on her hands--ME--grabs the bag, walks it back to security and turns it in.  I have to tell you, a part of me expected TSA to call in the bomb squad and interrogate me under the hot lights in the Airport Jail like Gaylord Focker in Meet the Parents.  In reality, a pleasant airport security gentleman thanks me "ma'am" and I am on my way back to the tram.  This time the opposite side tram comes first...I am a creature of habit and I walk to the front car again.  It is again empty--just me...I board the tram and imagine I am back 10 years with my kids again heading into Disney.  How cool is the tram the first time you see it!? 

Anyway, the 30-second tram ride ends and a woman appears in the doorway.  Just as I am about to pass her she looks at me and says, "Did you see a bag on the tram?"  I say, "Yes, a black Swiss Army bag!  As a matter of fact I found it and returned it to TSA--it is at the desk."  Relieved, she thanks me and hurries onto the tram as the doors close.  At that point a wave of emotion wells up in me at the magnitude of what just happened.  It was a twilight zone moment!  Rod Serling was about to come out of the broom closet and launch into a sermon on the odds of this happening, and why two strangers happened to bump into each other on the Disney tram at that exact time, and whether it also happened in an alternate alien universe! And the music plays---doo doo doo do...

Here is why I think this happenstance is so crazy:  number one, if I hadn't been ID'd as a yogurt smuggler I would've taken a different tram.  Number two, the tram I came off was not the one the bag was left on. It was on the opposite side.  The woman didn't even remember which tram she took.  Number three, I got into the front car again.  What were the odds of that happening?  I guess pretty good, being the creature of habit that I am--but COME ON!  And number four, something made the lady ask me even though I had already walked past her.

These experiences happen a lot to me.  And I NOTICE that they happen.  I am sure that these types of things happen to lots of people.  Sometimes they may happen in numerous occurrences but go unnoticed...I believe that energy, sensitivity, and the power of people is special and that we haven't even tapped the potential that we have as a society to understand each other on levels that we have not yet begun to imagine.

It makes me hopeful that our connections as human beings will continue to grow in a positive way to make the world a better place.


           Tesla is developing an electric self-driving truck         
Emails between Tesla and the Nevada Department of Motor Vehicles have revealed a prototype autonomous electric truck that will drive itself and move in 'platoons' could soon be tested in the state
          iPhone SE 2 backplate prototype surfaces        

Single-lens camera design similar to iPhone 7

Apple is expected to release a second-generation refresh of its popular iPhone SE, which has enjoyed robust sales in the low-end market worldwide and has now cut into the sales of higher-priced iPhone units in the US.

An image surfaced earlier this week from a Weibo user showing what appears to be the rear panels of several redesigned iPhone SE units. While there is no way to verify the authenticity of the source, the printed parts sheet in the background says the panels are made of Ion-X glass. This is the same material Apple uses in the Apple Watch, which is very similar to Gorilla Glass found in many smartphone displays.



The parts sheet also lists the codename as N79. Apple has a tendency to begin its iPhone codenames with “N” followed by a two-digit number. For instance, the iPhone 6 was N61, iPhone 6 Plus was N56, iPhone 6S was N71 and iPhone 6S Plus was N66. More recently, however, the company switched to using nearby locations as codenames for iPhone 7 (Sonora) and iPhone 7 Plus (Dos Palos), similar to what it does with Mac OS X releases.

Design similar to iPhone 7 suggests 4.7-inch display

Based on the back-panel prototypes, the purported iPhone SE refresh will include a ridge-less cut out for a single-lens camera with an LED flash sensor right beneath. Due to the Ion-X glass construction similar to an Apple Watch, others have suggested that the material is ready to support wireless charging and this may be a feature included with the handsets this year.

Overall, the design appears very similar to the 4.7-inch iPhone 7, which could indicate a panel size upgrade from the existing iPhone SE’s 4-inch screen. As reporters have mentioned over the past year, the size of Apple’s entry-level device is the least of its worries – the current base price of $399 for the 16GB version is simply too high for non-subsidized pricing in developing areas. While customers have the option of paying for a carrier-locked model on a 2-year contract, fewer of them are going to be willing to upgrade at the same frequency as in years past unless the base price drops substantially.

Device may arrive in 2H 2017

Last year, noted KGI Securities analyst Ming-Chi Kuo suggested that Apple will not release a second-generation iPhone SE in the first half of 2017. In other words, he suggests a release will not happen before the end of June. Other sources are more apt to conclude that the low-end device could release simultaneously with flagship iPhone hardware later this fall.

          AMD has working Malta 7990 cards        


Cebit 2013: Still not ready to show them

We have heard that AMD was mulling whether to show off the dual-GPU card known as Malta at Cebit and in the end it didn’t. Partners are telling us that they still haven’t got samples that they can show to the public.

Many have confirmed the existence of the dual-chip Radeon HD 7990 successor, as well as the Malta codename, and the only thing we know is that it is realistic to expect the card in Q2 2013, from April onwards. At this time there is no set date for the launch.

It has been a while since AMD launched the Radeon 7990, and it was carried by a limited number of partners. Now we hear that Malta will end up with more than just three partners. It will definitely be more reasonably priced than the Asus Ares II, which sets you back $1500 or €1250, and it seems like €999 is the target, although we don’t have the exact figure yet.

The prototype Malta cards work, but we hear that AMD simply didn’t want to show any, at least not to general public at this time. 

          Latest OUYA update says progress continues        


First factory-made prototype close to reality

The latest information from the land of OUYA indicates that progress is still being made on a number of fronts. The biggest news is that OUYA is on track to deliver the Developer SDK in December as projected. OUYA has indicated that a large number of developers have contacted the company with games in all states of development and developers eager to produce games for the OUYA.

OUYA says that orders continue to roll in each day from the OUYA.TV web site, which is encouraging as the audience for the console continues to grow. Speaking of the console, the OUYA team just recently returned from Taiwan and Hong Kong, where they met with manufacturers and suppliers. A manufacturer has been chosen, and finalization of the circuit-board layout and the colors, materials, and finish is proceeding.

Testing of the controller prototype continues, and based on feedback the precision D-Pad and four triggers will be part of the design. Of course, the O-U-Y-A button design inspired by feedback will make the cut. Testing each button, trigger, grip and stick continues and, of course, they are getting important feedback from developers, as well.

Kickstarters were informed that a survey will be arriving toward the end of this month that will allow players to choose and reserve their usernames, which can be up to 16 characters in length. Of course, these usernames will be first come, first served, so you might want to be thinking of them now.

Things look to be on track, and once the first factory prototype is manufactured we expect the company will have a much better idea of whether it can hold to the timelines it has set.

The best news is that interest remains high for the OUYA and even retailers are paying attention.

          Mozilla shows off iOS browser prototype        


Dubbed Junior

Mozilla has demoed a prototype of its browser for Apple’s iOS and, although still at early stage of development, it had a few interesting things to show.

Note that Apple’s restrictions prevented Mozilla from making a native port of Firefox, which was not the case with Android. Indeed, the prototype looks nothing like Firefox, and the company is building it on the WebKit HTML rendering engine rather than Gecko.



Junior looks quite nice and the company has a few simple yet effective solutions in mind. It aims for a sleek, full screen interface, which should make navigation a breeze.


The browser combines tabs and history into one, and the screenshots show just how: the topmost tiles are basically tabs, while the links/history tiles sit below.


Although ignoring Jobs’ mob sounds fun, it would be a terrible business move and Mozilla knows that. The company does have a long way to go, but judging by some of the initial ideas, they’re on the right track to make Junior a senior.

You can find out more by checking out the presentation here.

          Kinect built into laptops?        


Asus said to be showing prototype

Sources are claiming that Asus has a new laptop prototype that has Microsoft’s Kinect technology built right into the unit. The prototype is said to feature the Kinect sensors above the screen in the same general area where the web cam normally goes.

Microsoft has sparked more speculation of the use of the Kinect technology outside of the Xbox with the release of a development kit for Windows, and news that the company would be licensing the technology. Originally, it was said that the licensing would be for television manufacturers wanting to incorporate Kinect into their television designs, but with the news that Microsoft would be releasing a Kinect for PC package for $250 for PC later this year, it would seem that laptops with the technology built in are also coming.

While our sources have confirmed that Asus is not the only one working on incorporating the technology into its designs, the actual release of these prototypes remains unknown. We suspect that manufacturers and designers will continue to evaluate the technology with prototypes to decide if this is a direction that makes sense. With rumored native support in Windows 8, an eventual release by multiple manufacturers seems to be more than just wishful thinking.

Read more here.


          iPhone 4 prototype selling on eBay        

Auction mimics Apple’s pricing policies 
Apparently, an iPhone 4 prototype made its way to eBay and has managed to hit quite a hefty price, which we fear will only give Apple’s overlord new pricing ideas.

The device is said to feature a label reading ‘DF1692’ in the lower right corner and another one in the back that says ‘XXGB’. However, it cannot be activated through iTunes and the user didn’t have much luck with AT&T’s SIM card either. The serial number is not recognized by Apple but the SIM tray and IMEI numbers confirm the device is a tester.

We checked on eBay and the last bid stands at $1,700. It is a bit strange seeing the prototype outprice the real deal, but we've grown to expect absurdities from Apple fanboys.

More here.

          Apple said to be working on TV        

Building a Smart-TV prototype
A number of sources tell us that Apple is looking to move beyond Apple TV, and they are apparently working on what is being called a “Smart TV” prototype device that will combine a number of features to create a totally new and unique device.

The new Apple Smart TV device is likely to include some combination of TV content from a number of sources, including input from OTA or Cable Card and, of course, DVR functionality. It will likely include a number of apps for use on the TV, including FaceTime and web browser integration similar perhaps to what we have seen on the Google TV offering. Of course, the device will likely support gaming of some type, as well. It will offer a combination remote and keyboard in one.

It is hard to know all that it will offer, as it is still being defined, and right now they are still in the development stage. We do know that it is likely ARM-based, using the latest Apple dual core design running the latest version of iOS. It is likely that it will offer some sort of iPhone/iPad/iPod Touch docking support. It will offer several streaming options and some other programming.

Look for it to arrive with an official announcement around the holidays, according to the whispers we are hearing. Then again, it might amount to nothing and Apple may never bring the product to market.

          AMD has Bulldozer CPU prototypes in house        

Sampling partners in Q4 2010
At its analyst conference call in November, AMD may be ready to talk about more exact launch dates for its Bulldozer architecture. At this time, AMD is only telling the world that Bulldozer will launch in 2011, but John Fruehe, AMD’s director of servers, has said that partners should get samples in Q4 2010.

Come November we should get something better than just "shipping in 2011". Our sources are implying that Bulldozer prototypes are in quite good shape and that, despite rumours of a second-half 2011 ship date, we might actually see the first Opteron server chips as soon as the first half of 2011.

Mr Fruehe did confirm that AMD has been playing with chips in house and he said that they are happy with what they are seeing. He implies that partners should be excited once they get the chips in Q4 2010.

AMD can just hope that all will go well, as it has some serious market share to regain.

          One Education’s Infinity modular laptop/tablet hits Indiegogo        
It was way back in February that the One Laptop Per Child (OLPC) initiative and its partner One Education announced a modular laptop/tablet hybrid, dubbed the XO-Infinity, promising a debut in the following weeks. Well, it’s a bit later than expected, but One Education finally has working prototypes, and, after dropping the XO part of the name, has now launched … Continue reading
          MeshCon'14 on Berlin's Abendschau        

Abendschau report on the MeshCon Fashion and Tec Week Berlin 2014: software applications for fashion production, WLAN chips for clothing, and prototypes of knitting machines.



FashionTec Working Group:

FashionTec Meetups Berlin:

Tanja Mühlhans, contact person | Projekt Zukunft Berlin

Tanja Mühlhans at MeshCon’14

          Beating the Clock: Audi Outlasts Competition at 2014 Le Mans Race        

Sparing you the suspense of reading the entire article to find out the outcome of the 2014 24 Hours of Le Mans: the race was once more a high point in the legacy of Audi’s prototype sports car racing program. Taking first place as well as second place on the overall leaderboard after a day of […]


          New Look Audi Prototype Takes to the Streets of Le Mans        

In less than three months, the 24 Hours of Le Mans should provide the best fight in the prototype class in years. Run as part of the 2014 FIA World Endurance Championship, the Le Mans race is also the pinnacle that all teams want to win. The re-entry of Porsche’s new 919 Hybrid, Toyota’s […]


          CREATIVE ENTREPRENEURSHIP LECTURES: The Complete Guide to Successful Product Development        
Wonder Mzileni will be giving a series of talks in the months of October and November at several of South Africa's top universities entitled "The Complete Guide to Successful Product Development."

As an innovative approach to speaker-audience interaction I have decided to create this blog post to serve as an auxiliary tool for the audience to effectively assimilate and execute the methods of product development that we, at Industrial Arts follow.

As an added bonus, those who were not able to attend the talk can still benefit from the topics discussed. The venues and locations of the talks will be: Cape Town-UCT; Johannesburg-Wits University; and Pretoria-Tshwane University of Technology. For more information on specific events and to register for a talk, please email your query to

Product Developers Guide: Introduction
To make the product development journey smoother, the IndustrialArts Consultancy team has put together the creative entrepreneurship lectures entitled "The Complete Guide to Successful Product Development." With an introduction to research and development, patents, confidentiality agreements and costs, as well as advice about seeking outside investment, licensing and marketing, the Guide helps you plan and assess how to profit from your idea.

Developing a product can be a thrilling experience. Unfortunately it can also be a costly and very long process and the reality is that most ideas don’t make it to become successful products. For a specialist product design and development consultancy like The IndustrialArts, solving problems and turning ideas into reality is something we do every day, but how do you make sure the product is a success? Just making your idea isn’t enough. You need to make sure it will make you money.

To help you take the first steps with your product we have created this guide to break the process down into five key steps:
  • Assessing your idea
  • Protecting your idea
  • Developing your idea
  • Selling your idea
  • Funding your idea
PART 1: Assessing The Idea
This is the starting point for any new invention and in many cases also the end. Not all inventions have the potential to be commercially successful and there are numerous reasons why people decide not to pursue the idea. This is the stage where you find out if your product has already been thought of, whether the product is viable in a market and also if there is a market big enough that may need your product.

There are a few questions you have to ask yourself:

Has your invention already been thought of?
Frequently inventors have ideas that already exist, either as products or as patents, in which case it is unlikely to be worth pursuing. In many cases a similar but not identical product or patent will exist, in which case you need to be sure your idea offers a big advantage. Thoroughly searching the internet and patent databases (see details later) is time well spent.

Who would use your product?
Is your product aimed at consumers or businesses? Is the market the whole population; is it national, gender specific or aimed at a specific demographic? It is useful to imagine a typical or range of typical uses. If you have close, trustworthy friends and family who fit this profile you can get their opinion but beware influencing their reaction. You need to be sure they aren’t just saying they like it because they don’t want to disappoint you.

What are the reasons people will buy your product?
What problems does it solve that aren’t solved by existing products or what advantages does it give? People have managed without the product until now, so why should they suddenly need or want it?

What is the potential size of the market?
This can be difficult to estimate but data does exist for sales volumes of most consumer product categories be it soft drinks, electrical appliances, furniture or medical devices. If such data is not available you may need to make an initial estimate from other sources, for example if you are designing a new baby toy you might use census data on the number of babies born each year to estimate how many such products are bought. When estimating the potential sales volumes be conservative - few new products will gain a market share of more than 5% unless they are truly revolutionary.

Can you protect your idea?
This is important: without protection, any competitor can copy your idea and use their market position and financial muscle to squeeze you out. The next section covers how to do this with patents and design registration.

What are the risks and difficulties you will face?
Developing, launching and profiting from your invention will be a long and complex process and problems can occur at any stage. Things to think about include; can you get the funding to develop it? Does the product require new technology to be developed? If so could there be problems or delays? Can you make the product for the target cost? Can you find distributors? Will consumers like it? Will retailers stock it? How will it be marketed?

What will the development and set up cost be?
There are many stages to developing a new product as explained in the Developing your idea section. The amount of work involved can vary dramatically from product to product. As well as the design and engineering costs other costs will include, prototypes, production moulds and set up costs, approvals and testing, initial production, shipping and storage. For a simple product these will be a few tens of thousands of pounds. For a complex product such as a medical device these can run into hundreds of thousands and even millions of pounds.

What is the target product cost and profit margin?
What will each product cost to make and what price can it be sold for? Most products are distributed through a supply chain with manufacturers, brand owners and retailers all handling the product. Understanding the pricing and margin each party needs to make is key to success. Many retailers will look to buy at half the consumer price or sometimes less. If you can’t offer them the profit level they need at the retail price they think will sell then they won’t stock the product. Do your research on the sector and retailers relevant to your product. What is the price of products of a similar size, type and complexity? Contact factories and retail buyers for initial price estimates.
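The margin arithmetic described above can be made concrete with a quick sketch. The prices and cost here are hypothetical numbers, chosen only to illustrate the "retailers buy at half the consumer price" rule of thumb:

```python
# Supply-chain margin sketch with hypothetical figures.
retail_price = 20.00                    # price the consumer pays
retailer_buy_price = retail_price / 2   # retailer buys at half retail
unit_cost = 4.00                        # your manufacturing cost per unit

# Your gross margin per unit after the retailer takes their cut.
gross_margin = retailer_buy_price - unit_cost
gross_margin_pct = gross_margin / retailer_buy_price

print(f"margin per unit: {gross_margin:.2f} ({gross_margin_pct:.0%})")
```

Note that distributors and brand owners in the chain each take a further cut, so the real margin left for the inventor is usually smaller still.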

What is the potential return on investment?
With information on the possible size of the market and profit margin you can make a very rough estimate of how much money the invention could make. Is it enough to pay back the development, set up, patent, distribution, administration, marketing and other costs? How long will the investment take to pay back? What is the best case and worst case? Does that make a good investment for you or an outside investor?
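A rough payback calculation along the lines described above might look like this. Every figure is a hypothetical placeholder, not a number from the guide:

```python
# Payback sketch: combine the cost and market estimates from earlier steps.
upfront_costs = 300_000    # development, tooling, patents, marketing, ...
units_per_year = 50_000    # from your market-size estimate
profit_per_unit = 1.50     # margin left after manufacturing and distribution

annual_profit = units_per_year * profit_per_unit
payback_years = upfront_costs / annual_profit

print(f"annual profit: {annual_profit:,.0f}")
print(f"payback: {payback_years:.1f} years")
```

Running the same sums with your worst-case volume and margin shows how sensitive the payback period is to those two estimates, which is exactly the best-case/worst-case comparison the guide asks for.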

Finding the answers to these questions may take a bit of searching but will tell you whether the invention is worth pursuing. This process should be thorough but should not involve spending a lot of money. It may seem like a lot of bother, but many of the answers to these questions can form part of your business plan (see later).

It is important to be unbiased and dispassionate in this analysis; it is easy to get so emotionally attached to your ideas that you ignore the problems and weaknesses of the invention. Remember, most successful inventors have many ideas before they find the true winner. Deciding not to pursue an idea that doesn’t add up is a positive result: it is better to save your money and energy for the great idea still to come.

PART 2: Protecting The Idea
For the small inventor, protecting your ideas is critically important. If you don’t have some form of protection there is nothing to stop a large competitor taking you idea and using their financial and marketing muscle to squeeze you out of the market. Patents are generally the best way to protect your ideas if they can be obtained but other protections such as registered designs, or trademarks and copyright also exist.

During the process of developing your idea it will be necessary to speak to individuals and companies who may become partners, investors, suppliers or customers. It is vital that this is done under confidentiality. A Confidentiality Agreement or Non Disclosure Agreement is the best way to do this.

There are three general requirements for your invention to be patentable: it must be novel, useful and not obvious. To count as novel your invention must be new, in other words never publicly known, used, or sold anywhere in the world. If your invention is described in printed material or available to buy anywhere in the world it is not patentable. To satisfy the criteria for utility your invention must be useful: it must perform a function and benefit society in some way. Anything that does not perform cannot be patented. To avoid being classed as obvious, your invention must give new results and should not be just an incremental improvement over existing products. Being different in design alone will not get your invention patented.

It is advisable to use a registered patent attorney to help understand whether you can patent your invention, what it would cost and the level of protection it would give. They can also help you draft your application and manage all aspects of maintaining and enforcing your patent rights. However, before you instruct an agent and start incurring costs, it is advisable to do some online patent research on your own. Using the various free online patent databases such as USPTO, Espacenet, Google Patents, etc., it is possible to find many relevant patents and documents and gain a good idea about whether your idea is novel or not. If after your search things still look positive, or you are unsure what your search results mean, then taking your results to a professional patent attorney will help them get straight up to speed and reduce their search fees.

Once you have worked out that your invention is likely to be patentable there are a number of decisions to be made such as what, where and when to patent. Patenting your product in multiple territories is time consuming and expensive so it is important to think carefully before making your application.

What to patent?

How you draft your patent, and what you claim as your invention, can have a big impact on the future earning potential of the idea. If your claims are too narrow and specific, competitors will be able to find ways of avoiding infringement by making small changes. Conversely, if your claims are too broad, the patent examiners may judge it prior art (technology and inventions already public) or not inventive. A good patent agent will help you draft your application to get this important balance right.

When to patent?

There are generally two approaches to timing your patent application either to patent as early as possible, or as late as possible.

Patenting early means you are less likely to get beaten by a competitor registering the same or similar idea. If you know people are working in the same field, this could be important. The disadvantage of patenting early is that you have to pay fees for application, searches and examinations much sooner and potentially well before the product has been launched.

By contrast, making your patent application at the last possible moment before the product is made public means money is saved. Because protection is given from the date of the first application, and one year is generally allowed for the search and examination process, you get at least one year of worldwide protection without having to invest heavily in applications, translations and patent agents for every territory. This can allow you to judge the success of the product and its potential in different markets, and potentially start earning profits before having to spend large amounts on patent protection. Another advantage is that your patent application can be based around a fully resolved and detailed product, allowing it to be stronger and broader.

Other Protection
If it is not possible to obtain a patent, or if you want to further protect your idea, there are other ways to protect your intellectual property (IP).

Registered Design
A registered design can protect the aesthetic design of your invention. It is purely about the form and materials and does not cover how the product works or is manufactured. Design Registration is a relatively cheap, quick and easy process but is only available in certain countries, including the UK.

Trademarks
Trademarks are symbols that distinguish goods and services in the marketplace, such as logos and brand names. If your product has a certain name or logo that you want associated with it, it is worth getting it trademarked so that no one else can take the name or logo.

Copyright
Copyright is an automatic right, which applies when the work is fixed, in other words when it is created or written down. It covers printed documents, text, journalistic and literary works, images, music and video recordings. It has limited applicability to products but offers some protection of your work.

PART 3: Developing The Idea
Undertaking a new product development requires a whole series of tasks to be performed. Typically the product design and development process is split into phases, each with distinct activities and deliverables. Every development has its unique challenges, and so the order and breakdown of these phases should be arranged to address the key challenges and minimise the risk of development. Many inventors will look to license their invention, and so the aim of the design process is often not to get a product into production but to get an investor or licensee on board.

The Full New Product Development Process
Details of all the key stages of the product development process are given below and also in the services section of IDC's website.

User Insight
Researching and understanding the needs and behaviour of the potential users ensures that people will connect with the product - a key requirement for success in the market. Insights gained in the research stage often lead to new product innovations.

Concept Generation
This is where the skill, experience and creativity of the design team are used to generate designs, which address the identified needs and come together to create a vision for the successful desirable product. Design concepts can be presented as sketches, storyboards, simple models or fully photo-rendered images.

For technology led inventions, research and experimentation is a vital part of the product development process. Developing test rigs and experiments helps gain understanding of how to apply scientific and technical principles to create an inventive new product.

Patent Support
The process of developing an innovative new product needs to go hand in hand with the patent process to ensure as many of the important features of the product are protected and that competitor patents are avoided.

Engineering Design
To ensure that the design is translated into a successful product suitable for high-quality, cost-effective manufacture, this stage of the development defines all the details of every part needed to make the product. A detailed understanding of design for manufacture and assembly, and close attention to detail, are a must at this stage.

Electronics Design
Working alongside the mechanical engineering design, the electronics engineers design the circuits, lay out PCBs, write software and ensure all details of the electronics are suitable for manufacture and interface with the mechanical parts of the design.

Prototypes are vital for any new product. They not only provide confidence that the design is correct and the product will work, they can also be used for marketing purposes. For inventors, a good prototype can be the key to securing an investment or a licence agreement. See also the section on designing for investment below.

Regulatory Support
Many products have to conform to regulatory standards governing safety, function and performance. This will require the product to be thoroughly tested prior to release. The requirements of the relevant standards should be identified at the beginning of the development to avoid difficulties later on.

Project Management and Support
The development project manager coordinates the efforts of the whole development team as well as all external suppliers (e.g. mould makers, manufacturers, shipping companies, marketing agencies, test houses, patent attorneys, etc.) to ensure the product is launched successfully. Ensuring product quality and managing the transition to manufacture is one of the key project management tasks.

Development for Investment or Licensing
Frequently for an inventor, the purpose of employing a product design company is not to undertake the full development but to develop the concept to a stage where it can prove the viability and commercial potential of the product to licensees and investors. The two key requirements at this stage are to prove the viability of the invention and to create a vision for the successful production product. This can be done with sketches and illustrations but will be more convincing with 'Looks Like, Works Like' models. Making one model which both looks and works like the finished product can involve doing much of the design and engineering development, so often two models are made.

Visual Model
Having a clear vision for your product is important when trying to sell the benefits of your invention. A high quality aesthetic model illustrating what the product would look like is one of the best ways to communicate that vision and get potential investors and buyers excited about your idea.

Functional Model
The ’Works Like’ model is made to prove that it is technically possible to build your invention. Potential investors and buyers will respond more positively if they can physically see and touch what you are trying to sell to them. The functional model does not have to be particularly elegant in design, but your model will have to represent what the invention can do and that it can actually do it.

PART 4: Selling The Idea
Being able to sell the benefits of your invention is crucial to success. Think hard about how to communicate the benefits of your idea to consumers and buyers. Selling is also important if you are looking for licensees or investors. You have to be able to convince them that your idea makes sense to invest in and that you will in turn provide them with a fantastic return on investment. A good, well-thought-out and structured business plan is the key to making investors want to invest in you, your product and your business venture.

Business Plan
Your business plan is the blueprint to how you are going to make money from your invention. Writing a business plan will help you plan out your business, pre-empt some of the difficulties and, crucially, estimate the capital required and the likely return on investment for your product.

There are many books and websites dedicated to writing business plans, but the basic structure should be:

Executive summary – about your product, patent and the business opportunity
Marketing research – analysis of the market potential, competitors, pricing etc
Commercialisation plan – details of how you will profit from the invention, e.g. licensing
Sales projections – estimates for pricing, sales volumes and profit margins
Manufacturing – plans, costs and timescales for development and production set up
Management team – profiles for your management team
Financial statements – projections for cash flow, return on investment and profit and loss

There are many different ways to profit from a great invention. It is important to think carefully about choosing the best one for you and your investors.

Selling your Patent
It may be possible to sell your patent to another person or company and receive a one-off payment for it. This means you will lose the rights to your invention and not receive any future royalties, so if the product is a huge success you will not see the benefit. On the plus side, you will potentially get cash in hand without risking much of your own capital; if the product fails, you still have your money.

Licensing your Patent
Another route to commercialising your invention is to license it. This means that you will retain ownership of the patent whilst allowing another party to make, use and sell the invention in exchange for royalty payments. The licence may be exclusive or granted to more than one party. You may grant a worldwide licence or agree deals territory by territory. You may also choose to issue licences for specific uses or industries. Use industry and product directories and registers to identify potential licensees.

Start your Own Business
You can of course launch your own company to sell the product. This means that the entire responsibility for fundraising, manufacturing, selling and distribution is on you. You will have to come up with a business and sales plan and conduct a market survey to make sure that your start-up company will be able to fully exploit the opportunity.

PART 5: Funding The Idea
You may be fortunate enough to be able to fund the project yourself, however, most inventors need financial backing from an outside source. Getting other people on board can also be helpful as they will have the skills and experience that can aid in the development of your invention. Finding funding can be one of the greatest challenges in the entire inventing process and it is impossible for us to go into great detail. However, we do have a few tips that may help.

You must research different funding strategies thoroughly; there are many companies and government organisations that can help you with funding as well as putting you in contact with the right people.

Finding Partners
It is always best to have several money-raising options as an entrepreneurial inventor. One of these options is finding financial partners: an individual, a group or a firm that may have a vested interest in your product. You provide the invention, and your partners bring the rest of what you need to the table. A partner can provide whatever they bring to the team for free, in exchange for a percentage of any future profits. There are a few different ways to find partners; you can place an advert in a newspaper or magazine, contact an investors' organisation, or approach a bank or other traditional financial sources who may be willing to finance your idea.

Venture Capital
Another way to find capital is to find a Venture Capital Firm. This firm will invest in your growing company and will give you the financial backing you need in order to advertise, do research, build an infrastructure and develop the product. The Venture Capital Firm will make its money by owning a stake in your company. Financial agents who work for a finder’s fee may be able to help you find a Venture Capital Firm that is willing to invest in your company.

Local and Government Programmes
Local, state and government programmes that provide businesses with financial assistance are another option to finance your invention and company. Try to research small business funding and angel network programmes. Another option is to research government grants.

PART 6: Product Developers Check List
The following checklist is a general guide to help you track your progress. Should you need more clarity on a certain issue, please refer to the Inventor’s guide handbook for more detailed advice.
  1. Have you researched the web and patent databases to make sure that your invention hasn’t already been thought of?
  2. Who is your target user and why will they buy your product rather than the existing or competitor products?
  3. Have you conducted market research to estimate the potential size of the market and understand the distribution chain?
  4. Are you sure your invention is patentable?
  5. Have you estimated how much investment is needed for development and set up?
  6. What will be the estimated cost of manufacture per unit and your predicted sales price?
  7. Is there sufficient return on investment to interest investors?
  8. Have you written a business plan?
  9. Do you have a model or prototype to demonstrate the viability and benefits of your invention?
  10. Will you sell or licence the IP or manufacture and sell the product?
  11. Have you secured funding or investment?
  12. Do you have the right team of individuals and partner companies to deliver your plan?
If you have good answers to all these questions, you will be well on the way to making your invention a reality.
          Farmhouse Wire Basket, Mud Room, Kitchen, Bathroom, Vintage Baskets, Office Supplies, Office Storage, French Chicken Wire Basket, Kitchen by AbundantHaven        

29.00 USD


14" x 8.5" at Top Opening
6" Deep

Oval Chickenwire basket is perfect for stashing rolled towels and soaps in the bathroom, scarves and hats in the mud room or office supplies and mail. Endless possibilities!

Fabric Options:
1. French Document in Charcoal or Red
2. Rustic Hopsack Cloth

Wire Basket Finish Options:
1. Black
2. Brushed Copper

Shaped metal handles with coiled detail can be positioned up or down.
Liner has rustic twine ties.

An Abundant Haven key charm is attached to the back label...a reminder to "open every door" to peace and abundance in your home.

**Liner fabric may also be customized to match your decor (please convo me for more details).


(Diane, May 2016)
I made a special request for a basket that wasn't even available, and Susan made it a reality! If you are like me, and love to have even your most mundane belongings (like my mini laundry basket) be not just utilitarian, but pretty as well, you have just found your new favourite Etsy shop. I am just so pleased with Abundant Haven and my little prototype basket! Merci!

(Becki, April 2014)
Thank you for the beautiful baskets! They are exactly what I wanted!! Thank you also for your graciousness! Definitely shopping with you again soon.

(Lauren, July 2012)
So perfect!!! I can't gush enough!! and so awesome to work with! Very attentive! Thank you!

(Mackenzie, July 2012)
So nice! It will work great as a card basket for our wedding!

          AngularJS custom directive with two-way binding using NgModelController        

It took me a while, but I finally got it right!
I recently tried to create a custom directive with two-way binding, using the 'ng-model' attribute. This was a little tricky at first: I ran into some articles, but they didn't seem to work for me, and I needed to make some tweaks to get it right.

I don't want to go over everything I read; I just want to publish the changes and gotchas you should know about.

The best article I read on the subject is this one:
I recommend reading it. It has the best explanation of how '$formatters' and '$parsers' work, and of their relation to the ngModelController.

After reading that article, there are 2 problems I ran into.

1. ngModelController.$parsers and ngModelController.$formatters are arrays, but 'pushing' my custom function to the end of the array didn't work for me: when changing the model, it never got invoked. To make this work, I needed to push it to the beginning of the array, using the Array.prototype.unshift method.

2. The second problem I had was that I needed to pass ng-model an object; passing it a primitive value won't work. You might be thinking that it's obvious, since a primitive won't suffice as a reference, but this wasn't obvious to me, since passing ng-model a primitive when using an 'input' element, for example, works and still updates it both ways.
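Both gotchas can be sketched in plain JavaScript, outside Angular entirely (an illustrative simplification, not Angular's actual ngModelController code):

```javascript
// 1) Formatter/parser pipelines run each function in array order, so where
//    you insert your function matters.
function runPipeline(fns, value) {
  return fns.reduce(function (v, fn) { return fn(v); }, value);
}

var formatters = [function builtIn(v) { return String(v); }];
formatters.push(function pushed(v) { return v + '!'; });     // runs last
formatters.unshift(function unshifted(v) { return v * 2; }); // runs first
// runPipeline(formatters, 5) runs unshifted, builtIn, pushed: '10!'

// 2) Why ng-model needs an object here: mutating a property is visible
//    through every reference, while reassigning a primitive is purely local.
function updatePrimitive(v) { v = 42; }     // change is lost on return
function updateObject(o) { o.value = 42; }  // change is visible to the caller

var primitive = 1;
updatePrimitive(primitive); // primitive is still 1

var model = { value: 1 };
updateObject(model);        // model.value is now 42
```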

For a full working example of a two-way binding directive using ngModelController (the ng-model attribute), you can take a look at this:

          Reviewing Kibana 4's client side code        

I haven't written anything technical for a while, mainly because I changed jobs a few times in the past year. After working at Sears Israel for almost 3 years, I thought it was time to find the next adventure. I think I finally found a good match for me, and I'll probably write a whole post about that soon.

For now, I'll just say that at the new startup I work at, we're doing a lot of work on the ELK stack, and I got to do a lot of work on Kibana. Even with years of experience on various client side applications, I still learned a lot from looking at kibana's code. I think there are many things here written really elegantly, so I wanted to point them out in a concentrated post on the subject. There are also a few negative points, mainly minor things (in my opinion), which I will mention as well.

At First Glance

Kibana 4 is a large AngularJS application. The first thing I noticed when looking at the code is that it has a great structure. Many AngularJS tutorials (or tutorials for any other MVC framework) and code-bases I've worked on have the messy structure of a 'models' directory, a 'controllers' directory, and a 'views' (or 'templates') directory.
Kibana did the right thing by organising the code by features/components, and not by code-framework definitions. This makes it much easier to navigate through the code base, and to easily add more features.
Having a code base organised by controllers, models, views, etc. doesn't do much for your code base except turn each directory into a pile of unrelated features, violating the Separation of Concerns principle.

(In the image you can see each component grouped in its own directory, which includes its templates, its code and its styles all together.)

In addition, most AngularJS applications I've seen have all their routes defined in one file (usually app.js or index.js), which goes along with many global definitions, and sometimes logic related to specific pages or models all in a single file with no relation to any feature.
Kibana's code is nicely organised, and each 'plugin' or 'component' (discover/visualize/dashboard/settings/etc) defines its own routes in its own controller.
They manage to do this by creating their own 'RouteManager'. This basically defines the same API as angular's route provider, but it collects the routes you define and, in the end, calls angular's route manager to actually add them (by calling routes.config).
This custom route manager also adds the ability to resolve certain things before the route is called, which is really useful in many situations.
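A toy sketch of the collect-then-register idea (the names here are illustrative, not Kibana's actual RouteManager code):

```javascript
// Collects route definitions from many plugins, then hands them all to the
// framework's real route API in one bootstrap step.
function makeRouteManager() {
  var routes = [];
  return {
    when: function (path, config) {
      routes.push({ path: path, config: config });
      return this; // chainable, like angular's $routeProvider.when
    },
    config: function (register) {
      routes.forEach(function (r) { register(r.path, r.config); });
    }
  };
}

// Each plugin registers its own routes independently:
var manager = makeRouteManager();
manager.when('/discover', { reloadOnSearch: false });
manager.when('/visualize', {});

// Later, one call hands everything to the real route provider:
var registered = [];
manager.config(function (path, cfg) { registered.push(path); });
// registered is ['/discover', '/visualize']
```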

Javascript Libraries

The creators of kibana did a great job (with a few minor exceptions that I will explain at the end) in choosing many open source javascript libraries to lean on while building kibana. It's usually a good idea not to reinvent the wheel, especially when someone has already done a good job before you.

RequireJS is a javascript module loader. It helps you create modular javascript code, and makes it really easy dealing with dependencies between modules. Kibana's code does a great job utilizing RequireJS by defining most javascript modules in the AMD standard.

A really nice trick they did here that is definitely worth mentioning is the 'Private' service they created. This is a wrapper that allows you to define a RequireJS module, with angularJS dependencies. This allows you to use angular's dependency injection abilities side-by-side with RequireJS' DI abilities.

Regularly loading RequireJS modules in the code looks like this :

define(function (require) {
  var myService = require('my_service');
  // now do something with myService
});

Using the 'Private' service you load modules like this :

define(function (require) {
  var myAngularService = Private(require('my_angular_service'));
  // now you can use myAngularService
});

And most important is that my_angular_service looks like this :

define(function (require) {
  return function ($q, $location, $routeParams) {
    // all angular providers in the function parameters are available here!
  };
});

The Private service uses angular's get() method to retrieve the $injector provider, and uses it to inject the dependencies we need.
(Take a look at the 'Private' service code in Kibana's repository.)

If you're not familiar with lodash, you should be. It's the missing javascript utility library that will definitely help you DRY up your javascript code. It has many "LINQ"-like methods (for those familiar with C#), and many other basic methods you would usually write yourself to help iterate over objects and arrays in javascript. One of the really nice features of lodash is that most methods can be chained, making your code more readable, and lodash uses lazy evaluation so performance is amazing!

I don't want to start writing about the features of lodash, but I strongly suggest reading their docs, and getting familiar with it.
Almost every service, component or controller in the kibana code starts with this line :

var _ = require('lodash');

They also did a really good job extending lodash with some utility methods of their own. Take a look at these files to see for yourself :

(There's one thing I don't like here, which is the 'get' and 'setValue' methods. They do a 'deepGet' and 'deepSet', which is like saying "hey, I know I have something here in this object, but I have no idea where it is". This just doesn't feel right... :/ )

Some HTML5

Throughout the code there has been some good use of html5 features.
The first one I noticed and really liked is the 'Notifier' service. I really like the abstraction here over notifying the user of different message types, and the abstraction over the browser's 'console' methods. The 'lifecycle' method is really neat, and groups related messages together in the browser's console. It also uses the newer timing API, which is much better than the older approach (it's more exact, and it's relative to the navigationStart metric).

Kibana also makes use of the less-common <wbr/> tag. Standardised in HTML5, it is intended to give you a little more control over where the line breaks when text overflows its container.

There's also use of 'localStorage' and 'sessionStorage' for saving many local view settings in the different kibana pages. In general, they did a great job in persisting the user's state on the client side. When navigating between tabs, it keeps you on the last view you were in when returning to the tab.

Another nice thing is that there is a lot of use with aria-* attributes, and recently I see more and more of this in the newer commits. It's nice to see a big open source project dedicating time to these kinds of details.

Object Oriented Programming

There is a great deal of attention to the design of objects in the code.
First, I like the way inheritance is implemented here. A simple lodash 'mixin' allows for object inheritance.

inherits: function (Sub, Super) {
  Sub.prototype = Object.create(Super.prototype, {
    constructor: { value: Sub },
    superConstructor: { value: Sub.Super = Super }
  });
  return Sub;
}

Many objects in the code use this to inherit all the properties of some base object. Here's an example from the 'SearchSource' object :

return function SearchSourceFactory(Promise, Private) {
  var _ = require('lodash');
  var SourceAbstract = Private(require('components/courier/data_source/_abstract'));
  var SearchRequest = Private(require('components/courier/fetch/request/search'));
  var SegmentedRequest = Private(require('components/courier/fetch/request/segmented'));

  function SearchSource(initialState) {, initialState);
  }

  // more SearchSource object methods
};



You can see the SearchSource object inherits all the base properties from the SourceAbstract object.

In addition, instance methods are defined on the object prototype rather than on each instance. This is great mainly for memory usage: putting a method on the object's prototype makes sure there's only one instance of the method in memory, no matter how many objects are created.
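A small sketch of that memory benefit (the object names here are illustrative, not Kibana's): every instance resolves the method through the shared prototype, so only one function object exists.

```javascript
// Methods on the prototype are shared by all instances, not copied per object.
function SourceBase() {}
SourceBase.prototype.describe = function () { return 'a source'; };

function SearchLike() {}
// minimal stand-in for the inherits mixin shown earlier
SearchLike.prototype = Object.create(SourceBase.prototype, {
  constructor: { value: SearchLike }
});

var a = new SearchLike();
var b = new SearchLike();
// a.describe and b.describe are the very same function object,
// found via the prototype chain rather than stored on each instance.
```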

Memory Usage

Since kibana is a big single-page application, there is a need to be careful with memory usage. Many apps like kibana can be left on in a browser for a long time without any refresh, so it's important to make sure there are no memory leaks. AngularJS makes this easy to implement, but many programmers don't bother going the extra mile for this.
In the kibana code, many directives subscribe to the '$destroy' event and unbind event handlers, so as not to hold references to unused objects.

An example from a piece of kibana code (the css_truncate directive), with the handler body completed here to match the pattern described:

$scope.$on('$destroy', function () {
  // unbind handlers so no references to unused objects are held
  $elem.unbind('click');
  $elem.unbind('mouseenter');
});

Code Conventions

Kibana's code is mostly very organised and, more importantly, readable. A small negative point goes here for some inconsistencies in variable naming: some classes have public methods that start with '_' and some don't.

For an example of this, look at the DocSource object. The file even has commented 'Public API' and 'Private API' sections, but the naming convention differences between the two aren't clear.

Code Comments

I can say the code has enough comments, but I have no idea how much that actually is, since most of the code is readable without comments, which is an amazing thing. There are great comments in most places that should have them.

Just a funny anecdote: I was surprised to see comments that actually draw, in ascii art, the function they describe! Kudos!

/**
 * Create an exponential sequence of numbers.
 * Creates a curve resembling:
 *
 *                                     ;
 *                                    /
 *                                   /
 *                                .-'
 *                            _.-"
 *                        _.-'"
 *                   _,.-'"
 *              _,..-'"
 *         _,..-'""
 *    _,..-'""
 * ____,..--'""
 *
 * @param {number} min - the min value to produce
 * @param {number} max - the max value to produce
 * @param {number} length - the number of values to produce
 * @return {number[]} - an array containing the sequence
 */
createEaseIn: _.partialRight(create, function (i, length) {
  // generates numbers from 1 to +Infinity
  return i * Math.pow(i, 1.1111);
})


CSS Styling

Another great success here was using the 'less' format for css files. This allows for small and concise 'less' files, and easy reuse of css components (known as 'mixins'). A particularly great job has been done with colours: all colours are defined in a single file (_variables.less). By editing this file, you can easily create your own colour scheme.

(There are a few exceptions, mainly a few colours defined in js or css files, but it's 99% covered in _variables.less.)

Build Process

Kibana has a grunt build process set up. It compiles the less files into css, combines the css and js files (without minifying the js, using r.js), adds cache-busting parameters to the resource files, and performs some more small tasks.
I would be happy to see this upgraded to gulp, which is stream-based and has a much nicer API (in my opinion), but grunt still does the job.


Performance

After writing so many good points about kibana's source code, this is where I run out of good feedback. Maybe it's because when building kibana they had in mind that it's not to be served over the internet and is just an internal tool, and maybe it's just because I'm overly sensitive after working for quite a while on the performance team at Sears Israel. Either way, if it were an online website, its performance would be considered under-par.

JS files aren't minified. They are combined, but not minified. Unfortunately, the code isn't even prepared for minification: to minify safely, angularjs dependencies need to be declared as strings alongside the function itself, otherwise angularjs's dependency injection mechanism won't work once parameter names are mangled.
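A minimal sketch of the problem (inferDeps below is an illustrative simplification of how Angular-style injectors read parameter names, not Angular's real code):

```javascript
// Injectors that parse parameter names out of the function source break when
// a minifier renames those parameters; string annotations survive renaming.
function inferDeps(fn) {
  var params = fn.toString().match(/\(([^)]*)\)/)[1];
  return params.split(',')
    .map(function (s) { return s.trim(); })
    .filter(Boolean);
}

// Works only until minification renames $scope and $http:
var inferred = inferDeps(function ($scope, $http) {});
// inferred is ['$scope', '$http']

// The minification-safe array annotation keeps the names as strings, so the
// function parameters themselves can be renamed freely:
var annotated = ['$scope', '$http', function (a, b) { /* ... */ }];
```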

CSS files aren't minified either, just combined.

JS files are ~5MB !!! Yes, almost 5MB!! That's huge, and it's all downloaded on kibana's initial load. This could have been split into a few separate files, downloading only the ones needed for the initial view first, which alone would be a great improvement. There are advantages to not minifying the js, though, and I think that's what the creators had in mind: it's easier to debug with DevTools (no need for mapping files), and although the initial load takes a long time, after that there is no wait on any other page. If the resources are cached on your machine, then even getting back to kibana a second time should be really fast.

There are also some libraries in the source code which I think are redundant and could perhaps have been removed with a little extra work. One example is jquery, which is generally frowned upon when used with angularjs: AngularJS comes with jqlite, a smaller version of jquery, which should suffice.

I hope it doesn't sound like I think they did a bad job - I'm pointing out some areas in the code that maybe could've been done differently. All in all the app is amazing, and works great! :)

In conclusion

I had a great time learning and working (and still working) on kibana's code. I tried to show a lot of good things I like about the code, and point out a few minor bad things in the code. I hope you enjoyed reading this, and Kudos to you if you got to this point! :)

I also hope to write another post about how kibana communicates with elasticsearch and maybe another one on how it renders the visualizations with the help of D3.js

          Holiday art encounters - 2        
This encounter was extraordinary: it was an experience which lifted my spirits in a way which has perhaps not happened since way back in the mists of time.  We visited the arts centre in Lochmaddy, North Uist, to have lunch on our tour of that island, and to see what art was on exhibit.  Upstairs there was a room with quotes written on the walls from the logbooks of Roberta Sinclair, naturalist and submariner. Stationed on Berneray after the Second World War, she was a keen sea swimmer and regularly explored the waters around the island, gaining the nickname An Giomach (The Lobster).

A mobile with small cable cars filled the upper volume of the space.  The exhibition's title is The Lobster and the Lacuna.

The downstairs gallery contained a full-size cable car, with projected waves on the walls around it. In the mid 1950s, the system expanded into the Sound of Harris, when an unsuccessful attempt was made to create a new shipping hub on the east coast of the tiny island of Hermetray. Backed by investment from the then owners of Amhuinnsuidhe Castle, a small team built a prototype car which travelled from Berneray along the Grey Horse Channel to Hermetray, spending much of the route underwater to avoid crosswinds and interference with shipping.
Ms. Sinclair helped modify HCTC gondola No. 72 into an amphibious vehicle and was the only passenger on the prototype’s single voyage. Thinking both of a species of intertidal sea snail (Lacuna Vincta) and also of the silent unknowns of the world beneath the waves, she referred to her adapted cable car as The Lacuna.
So many exhibitions these days are based on or derived from history (events, people, social developments, etc.) that this appeared to be another such: beautifully designed and engaging, with amusing and largely quotidian quotes from the logbook, and the 50s wooden cable car there downstairs.

WHAT!?  A cable car transit across a stretch of sea constantly buffeted by strong winds and gales...?  And which also journeyed part of the route under water...?  A beautifully constructed cable car, but nonetheless a wooden cable car which resembled a cross between a shed and a beach hut?

I must admit that my initial response was that it was all real, and that it had just been some bonkers idea from the folks in charge, but then doubts (which came much more rapidly to my husband) crept in.
What a glorious wheeze.  What a wondrous conceit, so excellently and meticulously executed.  Brilliant story and accompanying detail and design.  The Hebridean Cable Transit Company is the creation of the artists Philippa C Thomas and Hector MacInnes.  The exhibition had been shown in Stornoway previously, and a blog noting its journey is here, whence came the images above (part of its title is Suspension and Disbelief).

Here are some snaps I took of the cable submersible:

Two other spoofs which I have encountered in my lifetime are similar: the spaghetti harvest film presented by the BBC (see it here), and the supplement on San Serriffe in the Guardian newspaper  (see it here).

          Analysts: Don’t Get Too Excited About Potential Dish/Amazon Partnership Yet        

Last week, there was talk that Dish Network and Amazon were engaging in discussions over a potential partnership, a move that just might begin to challenge the dominance of the Big Four carriers in the United States (namely Verizon Wireless, AT&T, T-Mobile, and Sprint), as well as the cable companies that are already entering the wireless industry. According to speculation, it is quite possible that Dish could take capex from Amazon in order to start setting up its network (which is said to focus on the Internet of Things), or the two companies could enter into an agreement that sees the satellite TV provider offer wireless services to subscribers of Amazon’s Prime service (for an extra charge, that is).

Because Dish Network happens to possess a considerable portfolio of spectrum assets, some have begun to believe that the satellite TV giant is looking to join forces with a mobile service provider, or at least a company with plans of becoming one. Indeed, apart from its rich cache of midband spectrum, Dish also spent $6.2 billion acquiring 486 licenses during the recent incentive auction of 600 MHz airwaves. Many industry watchers believe, however, that there is pressure on Dish Network to start making full use of its spectrum, mainly due to buildout mandates from the Federal Communications Commission (FCC). True enough, back in March of this year, the company revealed that it was planning to create a narrowband Internet of Things (NB-IoT) network. As for Amazon, while it has experimented with wireless projects before, it has not yet made a big move to enter the wireless business. But recent efforts do suggest that it might be getting ready -- as a matter of fact, this year saw the e-commerce giant submit an application to the FCC for Special Temporary Authority status, in order to start initiating trials using prototype equipment and spectrum frequencies, including parts of the 700, 800, and 1900 MHz bands.

So, is it likely that the two companies will partner soon? According to Kannan Venkateshwar, an analyst at Barclays, a direct acquisition (with Amazon as the buyer) would not be likely. Venkateshwar further pointed out that if Amazon did move to acquire Dish, the deal would attract scrutiny from the FCC.

Still, Dish Network remains an attractive target. Some analysts say that AT&T would be a fitting buyer, because it could readily integrate Dish into its existing DirecTV brand. And other cable companies would really love to get their hands on Dish’s spectrum.


          iPhone 4 + Personal Hotspot + Wi-Fi-only iPad: Possible, but with drawbacks        

Once Apple announced Personal Hotspot, the new iPhone 4-only feature, a lot of potential iPad buyers started asking the same question. "Can I use this feature with a Wi-Fi-only iPad and avoid paying extra for a 3G-enabled iPad, plus another monthly data plan for it?"

Indeed you can. With Personal Hotspot activated on an iPhone 4, any Mac or iOS device will treat the iPhone 4's Wi-Fi broadcast like it's a standalone base station. This means if you have an iPhone 4 and a Wi-Fi-only iPad, you can "tether" your iPad to your iPhone's 3G data connection for the first time.

"Awesome! So this means if I have an iPhone 4, there's no reason to get an iPad with built-in 3G, right?" Well, no, that's not necessarily true. I can think of three things you lose if you go Wi-Fi-only with your iPad and keep it tethered to your iPhone 4.

1. GPS. Only the 3G models of iPad have built-in GPS functionality. The Wi-Fi models can approximate your position using Wi-Fi, but it's almost never as accurate as with GPS. "Well, so what," you might say. "If I've got my iPhone right there, what do I need GPS on my iPad for?" That depends on how important GPS functions are to you. If you hardly ever use apps that depend on location-based services, you probably won't be missing out on much. If you're like me and you use location-based apps all the time, having to sacrifice GPS functionality on one of your iOS devices might be more trouble than it's worth.

2. Longevity, by which I mean the amount of time you can use the iPad in a single session. The 3G version of the iPad 2 is rated for nine hours of battery life when surfing over 3G. When using your iPhone 4 as a Personal Hotspot, you can expect the iPhone 4's battery to last for only about five hours before it needs to be charged. Granted, you can bring along the iPhone 4's charger, plug it in, and use Personal Hotspot as long as you like. However, the charger and cable are just two more things to carry, finding an unused outlet isn't always easy when you're on the go, and having your iPhone plugged into the wall quite literally tethers you to one spot. That leads into the third thing you give up if you go the Personal Hotspot + Wi-Fi iPad route...

3. Flexibility. If your iPad doesn't have its own 3G capability, it's totally dependent on your iPhone's Personal Hotspot unless you can find another Wi-Fi source. If your iPhone's battery dies, or if you forget your iPhone in a bar and some unscrupulous wag pockets it, your iPad loses all of the versatility it gained through Personal Hotspot.

The iPhone's data plans aren't anywhere near as flexible as those on the iPad, either. For one thing, in most countries the iPhone is locked to whatever carrier you buy it from; the iPad has no carrier locks whatsoever, and you can roam between carriers (or between countries) at a whim. Not only that, in several countries (most notably the US) you'll pay an extra monthly fee to enable Personal Hotspot on your iPhone 4. In the States this comes to $20 per month, which gives you an extra 2 GB of monthly data, for a total of 4 GB per month on your iPhone's plan.

On a US iPad plan, you'll get 2 GB of data for $25. That's $5 per month more expensive than enabling Personal Hotspot on the iPhone, but you can manage the iPad's data plan on a month-to-month basis -- no contracts to sign, and no obligations to any carrier. Most carriers also offer cheaper iPad plans with lower monthly bandwidth caps, which should satisfy most users' data needs.
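The plan arithmetic above can be sketched out; this is an illustrative toy calculation using the 2011 US prices quoted in the article (the function names are mine, not any carrier's API):

```javascript
// Personal Hotspot add-on: +$20/month for an extra 2 GB (4 GB total on the
// iPhone plan), tied to the iPhone contract, so it runs every month.
function hotspotAddOnCost(monthsOnContract) {
  return 20 * monthsOnContract;
}

// Standalone iPad plan: $25/month for 2 GB, month to month, so you only pay
// for the months you actually activate it.
function ipadPlanCost(monthsUsed) {
  return 25 * monthsUsed;
}

// Over a year of continuous use, the hotspot add-on comes out $60 cheaper:
console.log(ipadPlanCost(12) - hotspotAddOnCost(12)); // 60

// But if you only need iPad data for half the year, the month-to-month
// iPad plan comes out $90 cheaper, since the add-on still bills all 12 months:
console.log(ipadPlanCost(6) - hotspotAddOnCost(12)); // -90
```

The flexibility argument in the article is exactly this second case: the iPad plan only costs money in the months you use it.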

Personally, I'm still getting a 3G-enabled iPad 2. I may never actually use its independent 3G capabilities since the iPhone 4's Personal Hotspot costs nothing extra through my wireless provider, but I'd rather know that I could use the iPad's own 3G if I needed to.

Update: Many commenters have pointed out that Apple's Canadian website contains the following verbiage on the iPad 2's 3G capability, which at first glance seems to indicate the iPad 2 may be carrier-locked in Canada:

If you decide on an iPad with Wi-Fi + 3G, be sure to select the model that corresponds with the carrier you'd like to use for 3G service. The iPad model you purchase is specially configured to work with either Bell, Rogers, or Telus. So while you don't have to activate 3G service right away, you should choose your iPad with Wi-Fi + 3G according to the carrier you prefer.

Website iPhoneinCanada has confirmed directly with Rogers itself that the iPad 2 will not be carrier-locked in Canada. And despite similar wording on the US Apple Store urging buyers to decide between an AT&T iPad or a Verizon model before purchasing, the situation for the iPad 2 in the US remains the same: the AT&T iPad 2 is not locked to AT&T. iPhoneinCanada verified this by calling Apple directly; I just got off the phone with AppleCare myself, and they confirmed that just like the original iPad, the iPad 2 will not be locked to any specific carrier. Therefore, if you're like me and you live in a country where the iPad 2 won't be released until after March 11, you can still order an AT&T model iPad 2 from the US site without fear of having to jailbreak the thing in order to use it in your home country.

The AT&T versus Verizon iPad 2 situation is a matter of the hardware differences necessary to access the different networks, not a case of the iPad being artificially locked to one carrier or another. It's unclear why Apple chose to word things the way it did on its Canadian site (no other country's site contains similar wording), but the Canadian carriers themselves have stated the iPad 2 won't be carrier-locked.

          EXPTA Gen5 Windows 2012 R2 Hyper-V Server for Around $1,000 USD – Parts Lists and Videos!        
I’m very pleased to announce the release of my 5th generation Windows Server 2012 R2 Hyper-V lab server, the Gen5! You can use this home server to create your own private cloud, prototype design solutions, test new software, and run your own network like I do. Nothing provides a better learning tool than hands-on experience! […]
          USMC Advanced New Firefighting Vehicle From Oshkosh Featured at Marine South        
OSHKOSH, Wis. -  Oshkosh Defense, a division of Oshkosh Corporation (NYSE: OSK), is displaying the U.S. Marine Corps’ new firefighting vehicle, the Oshkosh P-19 Replacement Aircraft Rescue and Fire Fighting (ARFF), this week at Marine South, April 9-10 at Marine Corps Base Camp Lejeune, N.C. Three Oshkosh P-19R prototype vehicles currently are undergoing testing with the Marine Corps. Once fielded, the P-19R will replace the Marine Corps’ P-19A fleet, which was first fielded in 1984 and is rea...
          USMC Begins Testing Oshkosh Defense P-19R        
OSHKOSH, Wis. -  Oshkosh Defense, a division of Oshkosh Corporation (NYSE: OSK), has delivered three Oshkosh P-19 Replacement Aircraft Rescue and Fire Fighting (ARFF) vehicle prototypes to the U.S. Marine Corps to undergo testing. The Marine Corps selected the Oshkosh P-19R in May 2013 to serve as the Marines’ off-road firefighting vehicle of the future. Oshkosh will display the P-19R at Marine West, Jan. 29-30, at Marine Corps Base Camp Pendleton, Calif. “We delivered three P-19R prototypes i...
          Cell Phone Project        

In the last few weeks I have been working with other interns to create a cell phone holder for a disabled lady who can't move her hands very well. We worked on many different prototypes but came out with one that we thought would work best for her needs. When we started out, we worked out a way to use an angle so the holder would face the lady instead of lying flat on the bar it was going to be mounted on. We created the cell phone holder from acrylic and cut out the pieces with a laser cutter that we have here at the lab.


          Yamato's Die-cast Shield Liger Updates        
Not much news on the last day of the month it seems. ^^;

And not much news regarding Yamato's non-scale die-cast full action model Shield Liger since its official announcement back in mid August. Here's a little bit of updates on its prototype design.

A large volume of die-cast material is used for the model. All the silver parts on the leg as shown in the first image are made of die-cast material. From the initial announcement, we know that 80 out of 200 parts of the figure are die-cast. It's reported that the entire figure will weigh 1.8 kg.

The blue portion is either ABS or POM.

The silver fins on either side of the head as shown in the second image are made of die-cast material as well.

December release, 18,690 Yen (inclusive of tax).

Images are from Yamato's Development Blog.
          Limited Soul Spec Dragonar 2 & 3 New Images        
Not sure why, but Tamashii Web has decided to post up images of the prototype of Tamashii Web Shop's limited Soul Spec Dragonar 2 and 3 (instead of finished painted figures) as the latest coverage on them.

* Soul Spec XS-14 Dragonar 2 - Pre-order has started since July 2nd, December release, 7,875 Yen (inclusive of tax).

* Soul Spec XS-15 Dragonar 3 - Pre-order has started since July 2nd, December release, 7,350 Yen (inclusive of tax).

Pre-order will end today for these 2 limited items.

Images are from Tamashii Web Diary.
          Robot Damashii Destiny Gundam January 2011 Release & Others        
EDITED: Added more items and images.

Finally, back to my computer. New items from Tamashii Nation for early next year release. No official announcement on hobby websites yet.

* Robot Damashii [Side MS] Destiny Gundam - January release, price TBA.

Image of a prototype model on display at Tamashii Feature Vol. 1 Osaka stop held in early February, from this previous posting.

* Garo Kiwami Damashii Dan the Knight of the Midnight Sun - January release, price TBA.

* S.H. Figuarts Cure Blossom - January release, 3,500 Yen (exclusive of tax).

* S.H. Figuarts Cure Marine - January release, 3,500 Yen (exclusive of tax).

S.H. Figuarts Cure Blossom (left), Cure Marine (right).

* S.H. Figuarts Kazuma - From "s-CRY-ed". January release, 3,300 Yen (exclusive of tax).

Kazuma ranked No. 6 in the "S.H. Figuarts Origin" category in Tamashii Nation's Tamashii Feature Vol. 1 Desired Character Nomination Campaign that was done in January and February this year.

* Robot Damashii [Side ?] L-Gaim Mk. II - January release, price TBA.

Robot Damashii [Side ?] L-Gaim Mk. II ranked No. 1 in the Robot Damashii category in the special campaign mentioned above. The release has been confirmed since March this year.

* Ultra Act Kaiser Belial - January release, price TBA.

Image is from ToyWorld Forum.

* Ultra Act Ultimate Zero - January release, price TBA.

* Robot Damashii [Side ?] L-Gaim Mk. I - February release, price TBA.

* S.H. Figuarts Kamen Rider 000 Tatoba Combo - Revealed a week ago. February 2011 release, 3,000 Yen (exclusive of tax).

Image is from Muso's Photo Album.

Information and images are from CyberGundam.
          Second Life OS        
So the prof has us brainstorm in class about a "new and different" OS metaphor. I take the bait and throw out Second Life OS. Then he tells us we have to prototype it by Tuesday! Why oh why didn’t … Continue reading
          Work in progress        
I've been busy on the laptop ordering supplies for a new range of craft kits. I'm really excited about these ones, and can't wait to share them with you. Here's a little sneaky peek, but you're going to
have to wait for the launch for the full show and tell :-)

          Busy, busy, busy...        
Oh my goodness, it's been a while since I posted on here. Over summer I have been busy working on some new kit designs in readiness for Christmas. I finally finished the instructions last week so they are now good to go and have already gone out to some of the shops I supply. I'm just waiting for a break in the rain so I can get some decent photographs taken then they will be going in the Not On The High Street store as well.

Yesterday I had an idea for a new range of kits. I always get really excited when I have new ideas, so I'm going to be working on some prototypes over the next few weeks. They are a bit of a departure from the kits in tins, but they should sit nicely alongside my current range.

I'm also working on the design for a counter top box to hold my kits, which I will send out with all wholesale orders of 100 kits or more. So you can see it's all go here in Harrogate and with the festive season looming up it's going to get even busier.

One of my best-selling brooches at this time of year is the robin brooch. I'm not making any brooches to sell online myself as I just don't have the time, but the charity Leukaemia & Lymphoma Research is selling them along with my dog kits; if you're interested, they are available on their online shop,

          Information Architecture        
Two new links: "What is Information Architecture" and "Does the Fidelity of a Prototype Affect Results".
          Thermodynamics applied to highrise and mixed use prototypes        
Iñaki Ábalos and Daniel Ibáñez are the editors of this research report from the Harvard Graduate School of Design on thermodynamics applied to high-rise buildings and mixed-use prototypes, as its title indicates. It forms part of a broader line of research that Professor Ábalos has been developing […]
          Freemason Bro. Steve Wozniak Considers Return to Apple?

Steve Wozniak Considers Return to Apple By Kendra Srivastava | Tue Apr 12, 2011

Apple co-founder Steve Wozniak reportedly said he might return to the company if asked, giving shareholders plenty to ponder regarding the uncertain future of its leadership.

Wozniak, or “Woz,” still holds Apple stock and even remains a paid, if nominal, employee; he also maintains relations with Apple CEO Steve Jobs. But Woz differs significantly from Jobs in that he favors more customizable computers than Apple currently offers.

“My thinking is that Apple could be more open and not lose sales,” he said to Reuters. Given Apple’s current circumstances, Woz’s opinion isn’t just academic.

Steve Jobs is now in his third leave of absence, having suffered through pancreatic cancer, a liver transplant, and recently an undisclosed medical condition.

Jobs’ potentially indefinite recuperation recently prompted nervous shareholders to outline a CEO succession plan. But it was shot down in a February 23 vote, leaving people to wonder what will happen should Jobs be unable to return.

Enter “The Woz,” an enigmatic genius who partnered with Jobs and others to create Apple in 1976. Woz assembled the prototype for the [...]

          EyePhones Will Replace iPhones        

I presented the following prediction as part of a spirited Churchill Club debate with 5 other VCs. It was first published as text in AllThingsD.

Remember MS-DOS commands, and the WordStar keystroke combinations we had to memorize? Then the first Macintosh featured a mouse driven GUI that was game changing because it removed a layer of friction for both the data going in and coming out. When we tried that first model, we knew we could never go back to a C prompt.

And yet the impact of graphical computing was minor compared to how facial computing will change our lives, and how we all relate to The Collective. Think of it as a man-in-the-middle attack on our senses, intercepting all the signals we see and hear, and enhancing them before they reach our brains.

First Generation Mobile Computer
This is not science fiction, and based on prototypes I’ve seen, it’s a good bet that design teams in Google, Apple, Samsung and various military contractors are building eyewear computers that will render smartphones as obsolete as the first generation of mobile computer. I’m not talking about Google Glass, with its cute little screen in the corner. I mean an immersive experience that processes what we see, and then overlays graphical objects onto our field of view: true Terminator Vision. The US military has this capability today, so that troops can see pointers to their platoon members, and markers of known IED locations. So now it’s just a question of making the hardware small, cheap, and available in four adorable colors.

Not only will our favorite apps on eyewear computers be more immediate and engaging, but we'll experience new computing capabilities so compelling that we find them indispensable. For example, eyewear computers can record our lives, and enable us to summon any relevant conversation or incident from our past. With eyewear computers, we can truly share experiences in real time, transporting ourselves to the perspective of someone on a ski slope, or in a night club, a Wimbledon match, or the International Space Station.

Just as Terminator did in the movie, we will air-click on actual things we see to interact with, investigate, or purchase. We’ll integrate facial recognition and CRM for background data on everyone we meet. When we travel abroad, signs will appear to us in English, and when someone is speaking to us, we can simply turn on English subtitles.

 A new generation of games will be more immersive and engaging than ever before.

Five years from today, when smartphone sales are in decline, we will ask ourselves: Remember when we used to spend our days looking down at those little screens?

          Around the shop with Strobel Travel Guitars – April 2016        

Testing out low cost travel guitar prototype (wk14).  Need to make a template for Sunpost Neck pocket (to fit Sunpost necks).  Cut a Sunpost body in half to make a top and bottom template.  Low Cost bridge is ¼” closer […]

The post Around the shop with Strobel Travel Guitars – April 2016 appeared first on Strobel Travel Guitars.

          Reply #16        
Legitimate programmers selling systems beta test them with members on this site because they know that people here are knowledgeable, honest, and serious about the lottery. Major corporations pay big bucks to test; programmers can do it for free, and people post the results of their picks or their systems for free, for anyone to try. Lottocheetah, WinHunter, Lottery Director, Sedertree -- all have free trial versions to try before you buy, and people who like them post why, people... [ More ]
          SVG oder Canvas das ist hier die Frage!        
The more I work on the topic of "drawing in the browser", the more the question arises of which technology I should actually use. SVG and the canvas element both have their strengths. With JavaScript you can manipulate both an SVG graphic and canvas elements, which is essential for my project. But it seems to me that the canvas element is somewhat easier to work with. On the web you can find a number of applications that make use of the canvas element. And since those applications are in some sense comparable to mine, I built a small prototype with the canvas element (a very, very good tutorial, by the way, is here): you can create two classes, move them around, and draw an association between them. The blue container, a div element, contains the canvas element. Thanks to JS, you can now access the associations. The connection is not yet coupled to the two classes, though. Next on my list:
  • which rectangles are connected to each other, and how
  • connecting arbitrarily many rectangles to each other
  • self-references
You can get the HTML page here (only tested in Firefox > 2.0): Download file. Now with all the libs included. :-) Any objections to doing this with the canvas element, or other comments?
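The three items on that list boil down to modeling the diagram as data, separately from the canvas drawing code. A minimal sketch of such a connection model (all names here are my own invention, not from the actual prototype): rectangles get ids, and associations are stored as edges, which directly supports arbitrarily many connections per rectangle as well as self-references.

```javascript
// Toy model of the class-diagram prototype's connection bookkeeping:
// rectangles are stored by id, associations as a flat list of edges.
class Diagram {
  constructor() {
    this.rects = new Map(); // id -> {x, y, w, h}
    this.edges = [];        // {from, to, kind}
  }
  addRect(id, x, y, w = 120, h = 60) {
    this.rects.set(id, { x, y, w, h });
  }
  connect(from, to, kind = "association") {
    // Self-references (from === to) are allowed -- item 3 on the list.
    this.edges.push({ from, to, kind });
  }
  // Item 1 on the list: which rectangles is `id` connected to, and how?
  connectionsOf(id) {
    return this.edges.filter(e => e.from === id || e.to === id);
  }
}

// Usage: two classes, one association between them, plus a self-reference.
const d = new Diagram();
d.addRect("ClassA", 10, 10);
d.addRect("ClassB", 200, 10);
d.connect("ClassA", "ClassB");
d.connect("ClassA", "ClassA", "self");
console.log(d.connectionsOf("ClassA").length); // 2
```

The canvas rendering layer would then just iterate over `rects` and `edges` and redraw, so moving a rectangle automatically moves its connections with it.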
          Prototype Cheat Sheet        
On the site of , you can get a Prototype Cheat Sheet. I think it can be quite helpful when working with Prototype. The projects on the site are also worth a click, most notably Prototype UI and Scripteka.
          The Awakening / The Struggle (The Vampire Diaries, #1-2)        
The Awakening / The Struggle (The Vampire Diaries, #1-2)
author: L.J. Smith
name: Kati
average rating: 3.69
book published: 2007
rating: 3
read at: 2009/03/07
date added: 2017/03/14
shelves: 2009, in-english, straight, romance-para, tvd-to-tsc
This omnibus carries the first two books of Smith's tetralogy - "The Awakening" and "The Struggle". I would give the first one two stars, the second one four. The first one was namely too much of a sugary teenage romance - I'm probably too old to read young adult stuff anymore.

There were some things that really annoyed me in these books: Elena was perfection incarnated, the prototype of a Mary-Sue, she had everybody wrapped around her little finger, she always got what she wanted (be it the title of the Homecoming Queen or every boy in school) and when she didn't, she let the waterworks flow, adopting the "you hurt me so" expression. And of course, when two half a century old vampires arrived in town, she was the love of their life at first sight, they wanted to spend their eternity with her and only with her, no other woman had ever mattered *snorts*

But the story did get better - once the author stopped concentrating on the Homecoming Queen's dating plan and remembered that she wasn't writing an after-school special but a romantic horror story. Because the horror part was really well written. It was even rather scary in some places. I suspect that the next two books in the tetralogy will be better.

And yes, this is the series that the CW's adapting for TV. I think it'll fit their programming plan perfectly - it's Gossip Girl with vampires, after all...

          Android rilis Android ICS 4.0 untuk prosesor Intel dan AMD x86        
There is good news for Android fans: the Google Android development team has finally released an early version of Android Ice Cream Sandwich (4.0.1) for Intel and AMD chipsets/processors with the x86 architecture.

Android OS support for Intel and AMD processors using the x86 architecture fulfils the promise Andy Rubin made when speaking at the Intel Developer Forum, that all future Android versions would be optimized for Intel/AMD x86 chips.

At the time, Rubin demoed prototype Android tablets and smartphones based on the Intel Medfield processor (the fourth generation of the 32-nanometer Atom processor).

The initial release of Android x86 is not yet fully stable, as several key features are still under development and do not work yet, such as sound, camera, ethernet, and hardware acceleration for Intel chipsets. Features that do work well include Wi-Fi, multitouch, and OpenGL hardware acceleration for AMD Radeon chipsets.

Source:
I am working on a two-week prototype jam at work and this is the first concept for team Steed.
          Rooftop Concentrating Photovoltaics Win Big Over Silicon In Outdoor Testing        

A concentrating photovoltaic system with embedded microtracking can produce over 50 percent more energy per day than standard silicon solar cells in a head-to-head competition, according to a team of engineers who field tested a prototype unit over two sunny days last fall.

          The Bigmantoys CHEAP CREEP CLUB!        

          Shapeways store is open! Puppet MUSCLE exclusive variants available NOW!        

After handling the first batch of Fake Baron and Puppet MUSCLE test-shot 3D prints from Shapeways, I realised that even the more affordable plastic is not only really durable, but also provides good detail. Not as super sharp as the prototypes I use for casting rubber minis, but if the sculpt is given deeper etching and fewer bitty details like shoelaces and small locks of hair, they come out great.

Action-pose Leech-woman and Tunneler are available right now!


So here's a chance to offer collectors (including myself!) the best of both worlds - I'll still be producing MUSCLE-style rubber keshi with super high details on the sculpts, but there will also be the option to go to my Shapeways store and pick up variants and exclusive figures that have been modified to look great in lower resolution flexible plastic
Details like the protruding leech which are almost impossible to accomplish with a two part mold.


As a collector myself of most minifigure lines, I like the classic pose of MUSCLE figures and the durability of rubber (and the fact I can send rubber toys overseas at little cost to the customer), but looking at figures like Brownoise's Ashurashine, there's no getting around the fact that not needing a mold means the structure and pose of the figures are limitless. Some people just prefer MIMP and Gormiti, with all their variety and dynamic poses, which are often tough or impossible to replicate with a silicone mold. Also, the colours on these bad boys are gorgeous.
Cyber-pinhead is available now!


In case you're not too familiar with this, my 3D models are uploaded to Shapeways. They will create the figures to order for the customer, and the price is based on their work, material volume, and a tiny mark-up on my end. Shapeways will make the figure, and deliver to you either from North America or Europe, depending on where you are. Orders normally take about 10 days to make and ship via UPS. It works the same way as any online store :)
I was expecting these to be brittle and feel cheap. I've dropped them a ton of times, flexed the limbs a little, and can tell you these are sturdy! Definitely more toy than model!

... All the pics in this blog are the actual products you can get from Shapeways. As I shift more and more to digital sculpting, the exclusives in this store will expand and I'll try as always to find ways to keep them as affordable as possible.

          UnMouse cheap multitouch prototype        

A Microsoft research team has delivered a prototype called the UnMouse that could really be a big hit. This unit is a mouse pad sized sensor that is multitouch and pressure sensitive. It is flexible and thin enough to roll up. The article mentions that the construction of the device is “dirt cheap”. This is very exciting; is this the next mouse?

The idea of having low cost multitouch input is very enticing. While there are many ways to do multitouch right now, most are limited by their large size due to projector/camera setups, or by high cost such as the …

          Apahouser Earns ARM 9009:2013 Quality Certification with A+ Rating        

Apahouser, Inc. a world class metal manufacturing company achieves the distinguished ARM 9009 Quality Firearms & Armoring Certification

Marlboro, MA -- (ReleaseWire) -- 11/05/2014 -- G-PMC Registrars, LLC, one of the country's leading ISO certification bodies (CBs), announced today it has approved Massachusetts-based Apahouser for ARM 9009:2013 quality certification. Apahouser, a metal manufacturer specializing in precision sheetmetal, fabrication, stamping and machining passed all quality and system requirements mandatory for ARM 9009 accreditation, which is the world's leading system specific standard for the firearms and armoring industries (

According to Jack Oliver, Director of Certification for G-PMC Registrars, Apahouser is the first metal manufacturing company in Massachusetts to achieve ARM 9009 certification, earning the highest possible quality rating. "As an ISO 9001 certified company, Apahouser has already proven itself to be a world class metal manufacturing company, and by achieving ARM 9009 certification, Apahouser is now recognized worldwide as a quality approved vendor for manufacturing parts and components for firearms, body armor, and armoring applications."

ARM 9009 system accreditation is based on highly specialized standards that have been universally accepted as a mandatory requirement for suppliers, subcontractors and manufacturers of all firearms and armoring type products, materials, hardware, adhesives, fasteners, metals, plastics, glass, composites, ceramics as well as related processes used in conjunction for the manufacture of firearms, armored vehicles, and body armor systems.

G-PMC Registrars ( is accredited by the American Board of Accredited Certifications (ABAC), the leading accreditation body of the United States, and is internationally recognized by the Global Quality Assurance Council. G-PMC is among a distinguished group of currently 9 certification bodies in North America authorized to issue ARM 9009:2013 certificates.

About Apahouser, Inc.
Established in 1935, Apahouser ( has evolved over the years into a world class metal manufacturing company. The company is a leader in providing precision sheet metal, fabrication, stamping, and machining services, from prototypes to high volume. Apahouser provides responsive, low cost, quality metal fabricated products to the lighting, bio-tech, medical, telecommunications, computer, aerospace, defense, and electronic allied industries.

40 Hayes Memorial Dr
Marlborough, MA 01752

For more information on this press release visit:

Media Relations Contact

Richard Milton
Email: Click to Email Richard Milton

          The New and Forgotten 360 Revolvers from Smith & Wesson         

Smith & Wesson announced the company is now shipping the new Model 360 revolver. This new J-frame wheel gun is a bit unusual in that it, as a “plain” Model 360, joins the Model 360 PD that is already in the company’s catalog.

The New 360

The just released revolver is a 5-shot handgun that is designed for the .357 Magnum cartridge. As with most .357 Magnum handguns, this revolver will also shoot .38 Special loads including the +P variety.

It is built on the company’s J-frame. This is the current small frame used for the more diminutive Smith & Wesson wheelies. Unlike many of the J-frames built of aluminum, this one uses a scandium alloy.

The unfluted cylinder of the Model 360.

For the non-chemistry crowd, scandium is a naturally occurring element, not a marketing name. While offering a lighter weight frame at a similar strength level when compared to a typical aluminum alloy, scandium alloy tends to be a bit more expensive. The simple reason is scandium is not as common an element as iron, aluminum and some other metals. In fact, it is often extracted from deposits of uranium and rare earth ores.

The use of scandium, therefore, drives the price up and the weight down. With an unfluted stainless steel cylinder and stainless steel barrel, the gun weighs less than 15 ounces unloaded. The suggested retail price is $770. Compared to the all-stainless Model 60, this gun is about 6.5 ounces lighter for only $41 extra. Not a bad trade-off in my book.
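That trade-off can be sketched with quick back-of-the-envelope math. The prices and the 6.5-ounce difference come from the figures above; the exact 14.9 oz weight for the Model 360 is my own placeholder for "less than 15 ounces":

```python
# Back-of-the-envelope comparison of the scandium Model 360 against the
# all-stainless Model 60, using the figures quoted in the article.
m360_price = 770                      # suggested retail, USD
m60_price = m360_price - 41           # Model 60 is $41 cheaper
m360_weight = 14.9                    # "less than 15 ounces" unloaded (placeholder)
m60_weight = m360_weight + 6.5        # Model 60 is about 6.5 oz heavier

price_delta = m360_price - m60_price  # extra cost of the scandium gun
weight_saved = m60_weight - m360_weight
print(f"${price_delta} buys {weight_saved} oz of weight savings")
print(f"about ${price_delta / weight_saved:.2f} per ounce saved")
```

In other words, the scandium frame works out to roughly six dollars per ounce shaved off, which is the "not a bad trade-off" being described.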

It’s been my experience that Magnum loads really kick with featherweight revolvers like this one. I am not suggesting that this gun is going to be uncontrollable. Lightweight Magnum revolvers are quite controllable when shot with good technique. This is not a beginner gun, though. Don’t buy one for your spouse or significant other unless they have requested it.

The left side of the Smith & Wesson Model 360.

Smith & Wesson uses a black PVD finish on the gun. PVD, or physical vapor deposition, is a metal finish that has been growing in popularity in the gun industry. It allows a company to apply a wide range of colors to a firearm while also providing good durability and corrosion resistance. In the past, S&W has not used PVD extensively. I wonder if we will see more PVD finishes from the company in the future.

Offsetting the black finish is a set of two-piece rubber grips that are flat dark earth in color. The grips used on this gun are longer than the standard boot grip used on many of the small revolvers from Smith & Wesson. For most shooters, this means you can get a full firing grip on the gun.

The short barrel and front sight of the M360.

This model has a red insert in the front sight for improved visibility when compared to the plain ramp found on many J-frame guns. The rear sight is a machined notch in the frame at the rear of the top strap. 

Unfortunately, the Model 360 has the internal lock that Smith & Wesson installs on most of its modern revolvers. While the problems with these locks have been largely overblown, all of the problems I have read about have come from the lightweight guns. Hopefully, the problems have been completely resolved and will not create any issues in this gun.

The “Other” 360

Still in the company’s catalog is the Model 360 PD. The gun is very similar to the original Model 360 (also referred to as the Model 360 SC) that was introduced at the 2001 SHOT Show. That original model was discontinued, leaving only the 360 PD as a current production model.

The S&W M360 PD.

Both the original 360 and 360 PD also use a scandium alloy frame, but with a fluted titanium cylinder. In the 360 PD, this reduces the overall weight to only 11.4 ounces. The reduced weight is great for carrying, but does create an issue for reliability.

With such a lightweight revolver, recoil is increased. Due to the increased recoil, light bullets can actually “jump” their crimp and “walk” out of the case. If a bullet moves very far forward, it can lock the cylinder up and prevent the gun from firing. For this reason, Smith & Wesson states that only loads with bullet weights of 120 grains or more should be shot from the 360 PD. The same warning was given for the original Model 360. I would also add that if you are shooting handloads, make sure you are crimping correctly.

Bullet walking appears to have been eliminated in the newest Model 360, as the warning is not given for it. I suspect the few extra ounces of weight are enough to prevent this condition.

The S&W M360 PD.

Increased recoil also translates to the shooter. I’ve shot one of these with full power 125 grain .357 Magnum loads, and it does bark. However, it is completely controllable and accurate. It would overwhelm a new shooter, so please don’t hand one of these to a novice. It could turn them off to shooting completely.

Another difference in the PD model is that it uses a Hi-Viz red fiber optic front sight for even better visibility. However, this is counteracted by a reduction in shootability because of the use of stubby boot grips.

The final major difference between the two guns is that the PD model is significantly more expensive: $1,019. The titanium cylinder is largely responsible for that price increase.

The Forgotten 360s: M&P360 and Kit Gun 

There are a pair of longer barreled Model 360 revolvers that briefly made an appearance in the Smith & Wesson line, but they have since been discontinued. One was the M&P360 and the other was the Model 360 Kit Gun.

The M&P360 revolver.

The M&P360 was introduced in 2009* as a longer barreled version of this gun. It was still built on a scandium J-frame, but the 3” barrel and fluted stainless steel cylinder increased the weight to 14.7 ounces. This eliminated the problems with bullet walking. 

For this gun, Smith & Wesson went with a Standard Dot tritium front sight from XS Sights. This gave shooters a glowing night sight from the factory if they desired one. The gun had a black PVD coating and the same size grip as the new Model 360.

Another longer barreled gun was the Kit Gun version. This revolver was introduced in 2002 and had a 3.125” barrel. In addition to the longer barrel, it also differentiated itself from the smaller guns with the use of a Hi-Viz fiber optic sight (like the one on the 360 PD) and a Micrometer adjustable rear sight.

* Smith & Wesson’s sale sheet on the M&P360 shows the gun was launched in July of 2009. However, the 4th edition of the Standard Catalog of Smith & Wesson states the gun was shown at the 2007 SHOT Show. It is possible that an early prototype was on display at the SHOT Show, but that the gun did not ship until two years later.

          New Pistol Mag Speed Loader By ETS Group        

ETS Group has made a new speed loader for pistol magazines. It was first seen online back in December, when Military Arms Channel posted a video about it. ETS had their 3D printed prototype on display at SHOT Show last January.

Well, now it is available. The ETS Pistol Mag Loader is one of the first of its kind. It uses a loading concept similar to the speed loaders for the MP5 magazine by B&T. The loader has a slotted tray that you use to lift up to ten rounds from a box of factory-loaded ammo. The loader will work with 9mm and .40 S&W.

ETS loader


The loader is easy to use. As I mentioned above, you scoop up ten rounds and use the plunger to push the rounds into the pistol magazine. This ability is rather revolutionary. As far as I know, there has not been any other speed loader that can do this. The B&T loader, as well as AR15 magazine speed loaders, work similarly, but they have an easier job since they are shoving rounds into a double stack magazine. When you load an AR15 magazine or MP5 magazine, the rounds are merely shoved straight down into the magazine. You cannot do this with a pistol magazine. The rounds have to slide under the feed lips. Even though there are double stack pistol magazines, the magazine design funnels that double stack into a single stack.

The first release of the ETS Mag Loader had a few early issues. However, ETS sent out an email notifying the media of the changes they are now implementing. First of all, the Mag Loader will now be marked "MADE IN USA". Secondly, the mag loaders will come pre-lubed. There are internal surfaces that should be lubricated for the best performance. See the photo below.

ETS lube

The mag loader is fast; check out the video they made.

As I said above, the ETS Mag Loader is now available through ETS. It retails for $49.99.

          American Tactical's New Omni Hybrid 410 Shotgun        

It’s been a long time coming, but it is now here: the .410 bore shotgun from American Tactical. Called the Omni Hybrid 410, the new AR-style shotgun combines several of American Tactical’s technologies with several years of development.

I got a sneak peek of this gun at a firearms wholesaler show in 2014. Even though I would have liked to have seen it on the market sooner, it appears the company took its time to make sure the final product would deliver the performance shooters demand.

Ok, so what is this new gun? At its most basic description, it is an AR-style shotgun that uses a short stroke piston and is built on a polymer and metal “hybrid” lower. It can run a range of 2.5” .410 bore shells and feeds from a detachable magazine. 


The Omni Hybrid 410 uses the company’s own AR15 Omni Hybrid lower receiver. This is a polymer lower that uses an overmolded buffer tube insert for additional strength at the weakest part of all standard AR lowers. This overmolded insert wraps down from the buffer tube collar and extends to the rear takedown pin hole. This provides a great deal more strength at this critical area than other polymer lowers can offer.

American Tactical Omni Hybrid 410

Other AR uppers can be dropped on this lower. So, if you want to shoot 5.56 NATO or 300 BLK with this lower – no problem.

Upper Assembly

The gun uses an 18.5” barrel and has a birdcage style muzzle device on the end. If you need to add an accessory, the company includes a 13” handguard with KeyMod attachment points. The uppers are built to normal AR specifications and can be dropped onto most other brands of AR lowers that are built to military specs.

Ammo & Magazines

American Tactical states that these guns will run on most 2.5” .410 bore ammo including birdshot, buckshot and slugs. However, the company discovered during its testing that some of the ammunition on the market is advertised as 2.5”, but is actually longer. These longer shells may not run reliably in these guns.

American Tactical recommends measuring the shells of any ammo you purchase to ensure it conforms to the 2.5” length. Of course, the company also sells its own 2.5” .410 bore ammo. I would imagine these guns will run those loads just fine.

A single 5-round magazine ships with the Omni Hybrid 410. Additional 5- and 15-round magazines are available. Frankly, who doesn’t want a 15-round shotgun magazine to run in one of these guns?


When I first saw the prototype gun almost three years ago, I was told that the shotguns would have a MSRP of $650. As time passes, costs tend to increase. So, I was pleasantly surprised to see the suggested retail price of the shipping guns was less than the projection from years ago. The actual shipping MSRP is $599.95. Well done, American Tactical.

          High-Rollers: Why are roller-delayed guns so pricey?        

Roller-delayed guns were born from the frantic last moments of the Third Reich during the Second World War. At the time, the Wehrmacht was in desperate need of automatic weapons to arm the remnants of their once-vast military, in a last-ditch attempt to stop the flood of Soviet forces.

Bolt-action firearms and traditional locked-breech designs were considered far too expensive, as they used much more raw material and required longer to manufacture. Thus, German engineers attempted to find new, cheaper, faster methods of delaying the action on an automatic weapon. One of these prototypes was the MKb 42 Gerät 06. It utilized a modified version of the infamous MG-42’s short-recoil operation alongside a roller-delayed bolt carrier.

While this design was never produced in appreciable numbers, it lived on in the CETME program, and later in the HK G3 and MP5 designs we know today. But, given that the design was originally championed as an inexpensive alternative to traditional locked-breech designs, why then do these guns demand the premium they do today?

Well, unfortunately, only one company in America is dedicated solely to building roller-delayed firearms – PTR Industries. Their firearms are pretty reasonably priced, with base models retailing around $1,000.

The reason those aren’t any cheaper is that they don’t enjoy the same economies of scale as the AR-15. Simply put, the demand isn’t high enough for PTR to reduce its cost below the $1k mark. Fair enough, but then why do companies like Zenith and H&K charge huge premiums for guns utilizing the same internal operating method?

For H&K, their guns demand a premium because of both brand recognition and because they are imported from Germany. Yes, their quality control is excellent and the guns are very well made, but whether they are actually worth double or triple the price of a PTR remains to be seen.

Zenith Firearms appears to split the price difference between surplus and new production, presumably because they use a labor force that doesn’t command the same premium as those found in Germany or America. Heck, Zenith even uses HK-licensed machines to build their HK-pattern firearms.

I also believe one reason these guns demand such a premium is that manufacturing methods that were inexpensive around the middle of the 20th century are no longer so.

One thing is for certain: roller-delayed guns have a certain allure that makes them feel futuristic and exotic, some six decades after their invention. Plus, their reputation for reliability ensures that shooters will pay these prices happily, knowing they’re getting quality, reliable firearms.

          CZ Custom +20 rd Scorpion Extension        

The popularity of pistol caliber carbines (PCC) has increased exponentially. This is in part due to USPSA's inclusion of PCC as its own division. So now you can race with your PCC and be officially scored. USPSA has been dominated by handguns since its inception; however, the market and popularity of pistol caliber carbines caused USPSA officials to give them serious consideration for competitive use.

I dabbled in PCC a few years back when I was living in New York state. I used my KRISS Vector carbine in local matches while waiting for my pistol permit to be approved. Pistol caliber carbines are quicker and easier to shoot than a handgun, since you are shooting from a more stable position with at least four points of contact: support hand, firing hand, cheek and shoulder.

Lately I have been shooting a lot more PCC, specifically my CZ Scorpion carbine. I even had some prototype +10 round magazine extensions made to get 40 rounds in my Scorpion mags. Well, CZ Custom Shop has done even better. They made an aluminum magazine extension that adds 20 rounds to your 30-round magazine, giving you 50 rounds of 9mm.

Some of you may wonder if the magazines can hold up to all those rounds. So far I have not heard of these having issues. There are a handful of factory mags that have had cracked feed lips but CZ USA has been great in replacing them at no charge.

Why would you need 50 rounds of 9mm? Well, it's not a matter of need but a matter of speed. Reloading takes time. Yes, there are some very skilled shooters who can reload faster than I can type the word "reload", but if you can eliminate the time needed to reload, then you will be faster than someone who has to reload. This is why I wanted 40 rounds in my Scorpion magazines. Most USPSA stages have a 30 round count on average. Some stage designers like to increase that number for the open shooters who may be rocking 29-round STIs or SV Infinities. So 40 rounds is an ideal number for me. With 50, you know you will never have to reload in a stage unless something happens to the magazine.
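The capacity math behind that choice is simple enough to sketch. The 30-round average stage count comes from the paragraph above; the 20% allowance for make-up shots is my own illustrative assumption, not a USPSA figure:

```python
# Does a magazine cover a whole USPSA stage without a reload?
# Leaves headroom for make-up shots (the 20% margin is an assumption).
def covers_stage(mag_capacity: int, stage_rounds: int,
                 makeup_margin: float = 0.2) -> bool:
    needed = stage_rounds * (1 + makeup_margin)  # stage rounds + misses
    return mag_capacity >= needed

# Stock 30-rounder vs. +10 and +20 extensions on a 30-round stage:
for cap in (30, 40, 50):
    print(cap, covers_stage(cap, 30))
```

A stock 30-round magazine leaves no margin at all on a 30-round stage, while the extended 40- and 50-round magazines absorb a handful of make-up shots without forcing a reload.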

The CZ Custom magazine extension comes with an extended spring, which is actually the most crucial part of the extension. You can have a long extension, but if you don't have a spring that will push the rounds up into the gun, it is useless. The mag extension retails for $73.39.

          Shooting the New Avidity Arms PD10        

Avidity Arms is getting very close to shipping its new PD10 pistol. Developed under the supervision of noted firearms trainer Rob Pincus, a working pre-production model was available for shooting at the 9th Annual Combat Focus Shooting Instructor’s Conference recently held in St. Augustine, FL.

If you are not familiar with the PD10, let me catch you up to speed. Avidity Arms was formed with the initial goal of developing and bringing the PD10 to market. So far, the company seems to be well on its way to achieving this goal, and hopes to ship guns in 2017.

The new pistol is designed for one purpose: self-defense. All of the pistol’s features aim to further that purpose, and Pincus laid out the original design parameters with this in mind. The gun is designed with a full size grip and barrel, but with a thin frame and slide. As I understand it, the gun is meant to be large enough to fight efficiently with, but slender enough to conceal effectively.

To keep things as thin as reasonably possible, the PD10 is built around a 10-round, single stack 9mm 1911 magazine. A proven design, this type of magazine is already available on the market, which offered two benefits: (1) the company did not need to spend engineering time developing a proprietary magazine and (2) consumers have ready access to spare magazines from a variety of manufacturers.

Additional features of the PD10 include:

  • striker-fired, double action only system
  • AmeriGlo Luma front sight
  • I.C.E. Claw emergency manipulation rear sight
  • polymer frame with 1” width
  • lifetime service contract

I had a chance to see one of the first PD10 prototypes at a National Association of Sporting Goods Wholesalers show in 2014. Although it was a very rough prototype, I could immediately tell I liked the general concept. The gun was large enough to fill my hand, but thin enough to be relatively easy to conceal. I’m sure both companies will cringe when I say this, but it felt like my idea of a single stack Glock 19. For me, that feeling was a good one. 

If you like the concept of the PD10, but think the grip is too long for your needs, it sounds like the company is also developing a six round model. I don’t have any information I can share about this model (yet), but I suspect that the full size barrel and chopped frame combination could be quite popular with many people.

The target price on this pistol is $499. While there are other very good defensive handguns around this general price point, keep in mind that this one has upgraded sights and uses magazines that are already in common circulation. I think this gun bears watching.

          +10 rd Extension for CZ Scorpion Evo3        

40rd mag

Yeti Wurks has been making 3D printed upgrades for the CZ Scorpion Evo3. One of the items they make is a US-made 3D printed magazine base plate. They even made a +3 mag extension. I reached out to Eric of Yeti Wurks and asked if he could modify his design and make the extension longer, adding 10 rounds to a 30-round Scorpion magazine. He had dabbled with a +10 extension, but he said he could not get it to work because he did not have a spring that would work. However, I did. So he sent me his first prototype. Unfortunately, it did not work. The channel for the extension was in the wrong position, so the follower could not slide down into the extension. Eric redesigned it and sent out the new prototype you see in the photos above.

Since it is 3D printed and only a second prototype design, there are some issues. There is some slack in the position of the extension relative to the mag tube. If the extension is pushed all the way rearward, the follower cannot slide down into the extension. So I shimmed the extension at the front leading edge of the mag tube. Once properly aligned, the follower can travel up and down without issue. The corners of the extension were a bit sharp, so I filed them down to allow for smoother operation.

I ran the Scorpion and this specific magazine in a USPSA match just last weekend. It fed the Scorpion Evo3 flawlessly.

While testing the magazine extension, a small hairline crack appeared along the side of the extension. So Eric of Yeti Wurks is going back to redesign the extension and make it more robust. No details on price or availability just yet.

          SHOT Show 2016 Hottest Products        

SHOT Show 2016 was just a couple weeks ago. There were a lot of products and many of them are exciting. Here is a summary of some of the most interesting items.


Cobalt Kinetics AR15 auto mag drop

Probably top of the list for everyone there would be the Cobalt Kinetics AR15. Their AR has a unique feature that has not been done before in the world of AR15s. When the AR fires the last round of a magazine, the empty magazine is automatically dropped out of the magwell. When the shooter reloads with a fresh magazine, the bolt will automatically close when the magazine is seated. This feature will be very popular in the competitive world of 3Gun.


21 rd 27rd Magpul Glock magazines

On the horizon of magazine development, Magpul is following the competitive market. Just like Elite Tactical Systems made a competition-length 140mm Glock magazine, Magpul showed their new Glock magazines in 21 and 27 round capacities.



Torkmag 50rd AR15 magazine


Torkmag has an interesting roller follower design. While there have been roller-style followers in the past, the Torkmag is different than the rest. Due to the dual spring design, a coil spring plus a clock spring, there is more room for ammo. Their standard-size AR15 magazine holds 35 rounds, and at only a half inch longer than a typical 40-round magazine, their extended magazine can hold 50 rounds.


Hexmag AR10 magazine prototype

Hexmag showed off their AR10/SR25 prototype magazine. Hexmag has been wildly popular, and now they are expanding into .308 magazines.



With regards to AR15 accessories, the PDW-style stock can be seen everywhere. Originally designed by North Eastern Arms, the NEA Compact Carbine Stock concept has been copied by many. Troy Industries came out with their M7A1 stock. One downside to that design and the NEA version is that they require a proprietary bolt carrier group/buffer. So companies like MVB Industries made a version that is compatible with any AR15 bolt carrier group. Troy has followed suit with their release of the Tomahawk Stock; similar to the M7A1 PDW design, it is a collapsing stock. Strike Industries has come out with their own take on the PDW stock. Their version has a spring release, so when you deploy the stock from its collapsed position, it shoots out automatically into the deployed position. They plan on making two versions: one compatible with Troy/NEA BCGs and one for standard BCGs. Falkor Defense has their own take on the collapsing PDW stock. 

Falkor Optimus stock collapsed

Their Optimus stock uses a single rod, compared to NEA style dual rod systems, and it unfolds to a carbine length stock.

Falkor Optimus stock deployed


For classic firearm aficionados, Hill and Mac Gunworks has released their reproduction Sturmgewehr STG44. A couple years ago GSG made a .22LR replica of the infamous STG44. For those not familiar with the history, the STG44 was designed in Nazi Germany during WWII. It predates Kalashnikov's AK47 and Stoner's AR15, and it is known as the predecessor of the modern assault rifle. Well, HMG's version will be offered in an assortment of calibers. It will be made in its original caliber of 7.92x33 Kurz, but for those wanting a more common caliber, HMG will make them in 5.56x45, 7.62x39, and .300 BLK. The cost is a bit high at $1,799, but if you compare how much a real STG44 costs, it is a real bargain.


SilencerCo brought out their Maxim9 integrally suppressed handgun. When SilencerCo first announced the Maxim9, it was built on modified Smith & Wesson M&P9 handguns. They have since switched focus and decided to build the Maxim9 around the ubiquitous Glock 17 magazine. With the popularity and availability of Glock 17 magazines, it made sense to design the gun to use them. The Maxim9 will also be cut for Glock sights. They are looking into cutting the Maxim9 for RMR MRDS and integrating a light/laser system.

SilencerCo Maxim9 handgun



OSS suppressors

With regards to suppressors, none were as impressive as OSS Suppressors. While they are not new to SHOT Show 2016, they have reached out to get their product and name better known. They rented the Battlefield Vegas range for the entire week of SHOT Show. My friends and I got to go and test their system. We were inside a small indoor range with only about 8 lanes. It felt too small to be shooting guns like a full-auto belt-fed M249 and a Barrett MRAD in .338 Lapua Magnum. But my concerns were assuaged once we started shooting the guns with OSS suppressors. We were like kids in a candy store. The back wall was covered in rifles outfitted with their suppressors. There was even a .50 BMG rifle. Strangely, none of the guns available to test were normal civilian weapons. My friend wanted to try a 16” barrel AR15, and the closest thing they had was a 14.5” SBR. Most of the guns were full auto. Shooting the .338 Lapua inside this small space was the most impressive. We got a chance to take our hearing protection off momentarily to hear the Lapua. It sounded like an unsuppressed .22LR. We were required to wear hearing protection because there was another group at the first 4 lanes shooting unsuppressed.

          Lebanon Politics        
Michel Aoun comes home to roost

by Sami Moubayed [from Asia Times May 13, 2005]

DAMASCUS - Michel Aoun's return to Lebanon on May 7, after 15 years of exile, is yet to shake the political landscape of Lebanon. To some, it is a great victory, to others, a humiliation and a bitter reminder of civil war memories that many people have been working hard to forget.

Aoun returned to Lebanon on the offensive, hateful of everyone and everything that kept him in exile for so long, promising destruction of the existing order and sweet revenge. The Beirut he entered last week was very different from the war-torn one he left behind in 1990. That Beirut did not have a Rafik Hariri hallmark on it. Yet all the actors of Beirut 1990 are still there.

Former president Amin Gemayel, who appointed Aoun prime minister in 1988, upsetting tradition in Lebanon because Aoun was a Maronite, is still there. Patriarch Mar Nasrallah Boutros Sfeir, who worked for Aoun's downfall, is also still in religious office. Ex-prime minister Salim al-Hoss, who led a rival cabinet in 1989-1990, is there, and so is Samir Geagea, whom Aoun had viciously fought in the eastern districts of Beirut. The general who had been chief-of-staff and who had orchestrated Aoun's exodus from Baabda Palace stands today in Baabda Palace, the legitimate and internationally recognized president of the Lebanese Republic.

At Beirut Airport, Aoun told the masses, most of whom were too young to remember the civil war, that Lebanon will never again be governed by the "political feudalism" and "religious system that dates back to the 19th century". He called for an end to "old fashioned prototypes which represent the old bourgeoisie which persisted without questioning". Aoun has effectively promised to strike back at the entire political establishment of Lebanon. Will he succeed?

Before returning to Lebanon, Aoun promised a "tsunami" in Lebanese politics. Aoun's first encounter with the press and well-wishers at Beirut Airport was less than diplomatic. Annoyed at all the commotion, the ex-general barked at those welcoming him, claiming they were noisy. Once a military man, always a military man. Aoun was never a politician and never had direct contact with the Lebanese public. When people started seeing him as a national leader in 1989-1990, he was too busy with his war against Geagea and Syria to engage in populist politics. The security situation in Lebanon also prevented him from doing that. He never staged rallies during his career in Lebanon, but rather was always confined to the barracks, living the life of a professional soldier.

The average age of his supporters is 20, an age where young men and women are full of life, and easily enchanted by Aoun's fiery speeches, which he gave from his exile in France. A generation hungry for reform and hope, they supported Aoun as an exiled leader. Now that he has returned to Lebanon, and engaged himself once again in the dirty game of