Summary
I am a curious software developer with more than 10 years of experience building applications in different
languages and technologies. In my first 5 years I did pixel-perfect, highly interactive web design (UX using
Flash & JavaScript) for several design agencies. Now I build robust, "intelligent" (machine learning), scalable,
high-performance applications living in the cloud.
Currently I work closely with the CEO, CTO, and managers to help conceive, plan, design, and
develop software applications and product strategies. I am a creative thinker and I like to pitch
grounded ideas all the time, always weighing the value they bring, the complexity they add, and the
effort they will involve.
I started coding at the age of 16 (now 27); since then I have launched production systems with the following
languages and technologies:
Languages: Java, ActionScript, JavaScript, Objective-C, Python, PHP, Bash, Ruby, SQL, Scala.
Technologies: MySQL, Linux (many flavors), Node.js, MongoDB, Hadoop, Mahout, ActiveMQ, AWS, Rackspace,
Rails, Redis, CouchDB, Solr, Lucene, Apache, Hive, Cassandra, Memcached, Play, Nginx, Arduino, DNS...
ML: k-means clustering, collaborative filtering.
I have built from scratch several web-based applications, including audio editors, game engines, web
scraping frameworks, touch screen applications, online video processing applications, image galleries, video
portals, and tons of products.
I keep an eye on the tech media, and I follow the engineering blogs of big companies like Netflix, Facebook,
Google, and many others to keep up with the design ideas they implement.
Most of my work has been remote.
I also maintain an open-source, highly scalable web scraper; with it, I have scraped more than 1 billion pages.
https://github.com/calufa/tales-core
Old design reel: http://vimeo.com/11221914
-- Software as a craft
Experience
Stealth Startup at Stealth Startup
February 2013 - Present (1 year 2 months)
CTO at GigaLab
December 2011 - March 2013 (1 year 4 months)
GigaLab offers daily data harvesting, aggregation, transformation, and visualization services for any industry,
whether that means continuously scouting the web for relevant facts, aggregating company-internal databases,
or custom consulting projects.
At GigaLab I am in charge of planning, developing, maintaining, and scaling all the applications in an agile
and lean environment.
At GigaLab we collect, move, and visualize gigabytes of data with a fully automated system. We currently
scrape and process 2 million pages per day, happily.
GigaLab has a contract with the largest department store in Latin America, a contract that covers 4
countries.
About our Scraping System
It's built from scratch in Java. In our system, servers are nodes. A node can be used for scraping, map/reducing,
and/or storing data, among other operations. Nodes listen to git for changes and are in charge of auto-syncing
and auto-compiling the source code (from GitHub), so everything always runs the latest version. Nodes can
also create and destroy other nodes through an HTTP API, and they back themselves up and fail over (for
example, when an IP gets blocked while scraping).
We can scrape using multiple threads/spiders at the same time, across many servers, with automatic
failover and other niceties.
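The node model above can be sketched roughly like this. The real system is in Java; this Python sketch is only an illustration of the idea, and all class and method names here are my own, not the actual code:

```python
class Node:
    """A server in the cluster; roles might be "scrape", "mapreduce", "store"."""

    def __init__(self, node_id, roles):
        self.node_id = node_id
        self.roles = set(roles)


class Cluster:
    """Models nodes creating/destroying each other and failing over on IP blocks."""

    def __init__(self):
        self.nodes = {}
        self._next_id = 0

    def create_node(self, roles):
        node = Node(self._next_id, roles)
        self.nodes[node.node_id] = node
        self._next_id += 1
        return node

    def destroy_node(self, node_id):
        self.nodes.pop(node_id, None)

    def failover(self, node_id):
        # Replace a blocked node with a fresh one carrying the same roles,
        # so scraping continues from a new IP.
        blocked = self.nodes[node_id]
        replacement = self.create_node(blocked.roles)
        self.destroy_node(node_id)
        return replacement
```

In the real system the create/destroy calls go through an HTTP API against the cloud provider; here they just mutate an in-memory dictionary.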
Technologies
We use Git (git-flow), Java, Ruby/Rails, Node.js, MySQL, Solr, Redis, Hadoop, AWS EC2, AWS S3, Rackspace,
Ant, Apache, JavaScript, Ubuntu.
CoFounder & CTO at ScramblerMedia
August 2011 - October 2012 (1 year 3 months)
We (2 friends) are building a web-based video game engine and platform. The engine is driven by a set of
JSON files that define everything, from the interactions between characters, stage design, and scoreboards to
the design of the web pages. Everything is configurable and super dynamic.
This project has 2 fundamental pieces. The first piece is a SWF game engine built from the ground up that
reads a set of JSON files and SWF sprites to build the game stage, interactions, and gameplay -- an interesting
fact is that the engine file is under 50 KB. The engine relies heavily on interfaces, which allows us to create
abstract characters that can extend or implement behaviors -- for example, the enemies and the hero extend
from the character object, and the character object can have behaviors like ways to die, forms of attack, ways
to move, etc.
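The behavior pattern above can be sketched like this. The real engine is ActionScript; this Python sketch with made-up names only mirrors the idea of attaching behaviors to a shared character base:

```python
class Behavior:
    """A reusable piece of gameplay (a way to move, attack, die, ...)."""
    name = "behavior"

    def apply(self, character):
        raise NotImplementedError


class Walk(Behavior):
    name = "walk"

    def apply(self, character):
        character.x += character.speed


class Character:
    """Base object that heroes and enemies extend; behaviors are attached,
    not hard-coded into each subclass."""

    def __init__(self, x=0, speed=1):
        self.x = x
        self.speed = speed
        self.behaviors = {}

    def add_behavior(self, behavior):
        self.behaviors[behavior.name] = behavior

    def act(self, name):
        self.behaviors[name].apply(self)


class Hero(Character):
    pass


class Enemy(Character):
    pass
```

Because behaviors are looked up by name, a JSON config can list which behaviors each character gets without touching the engine code.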
The second piece is the webserver, built in Node.js, where we handle things like analytics and the generation
of the HTML that serves every game - every game can have its own CSS and HTML design, and every game
stage has its own URL to maximize SEO exposure.
One of the ideas of the project is that games should be easy to deploy, and that's why we use Dropbox to
sync everything. Every game has its own "Dropbox folder", which keeps games separated from each other.
Only the server knows about all the games, as it is in charge of building the pages for them. With this
approach animators have a better experience, since it's super easy to build and deploy a game. The
JSON files are also intuitive and clean.
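The folder-per-game layout can be sketched as follows. The real server is Node.js; this Python sketch is illustrative, and the "game.json" file name is a hypothetical example, not the actual layout:

```python
import json
from pathlib import Path


def discover_games(games_root):
    """Scan the synced root folder: each subfolder is one game, described
    by its own JSON config. Only the server runs this scan, so the games
    stay fully independent of each other."""
    games = {}
    for game_dir in Path(games_root).iterdir():
        config = game_dir / "game.json"
        if game_dir.is_dir() and config.exists():
            games[game_dir.name] = json.loads(config.read_text())
    return games
```

Deploying a new game is then just dropping a folder into the synced root; the server picks it up on the next scan.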
We evolve our engine with feedback provided by a custom analytics system. Our game is currently running on
Facebook, Newgrounds, and other portals -- all the websites use the same game SWF; we don't believe in
fragmentation!
DEMO: http://scramblermedia.com/charliesHardWorkDay
Data Scientist at CORE
August 2010 - December 2011 (1 year 5 months)
[Personal Project]
CORE mines millions of users on social networks, websites, and other data sources using a highly scalable
system built on top of AWS and/or Rackspace.
The app is a fully scalable, fault-tolerant system capable of collecting and parsing 150+ pages per
second on a 2 GB RAM Ubuntu server, at a cost of less than 10 dollars per day.
It is totally abstract: it can parse any public website by plugging in a parser class, and it takes full
advantage of the server's capabilities automatically. It can also scale to multiple servers in multiple
availability zones.
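The parser-class idea can be sketched like this. The real system is Java; this Python sketch with invented names only shows the plug-in shape:

```python
class Parser:
    """Site-specific plug-in: the generic scraping pipeline stays the same,
    only the parser changes per target website."""

    def parse(self, html):
        raise NotImplementedError


class TitleParser(Parser):
    """Toy example: extract the <title> of a page."""

    def parse(self, html):
        start = html.find("<title>") + len("<title>")
        end = html.find("</title>")
        return {"title": html[start:end]}


def run_pipeline(pages, parser):
    # The pipeline itself knows nothing about any particular site;
    # all site-specific logic lives in the parser instance.
    return [parser.parse(page) for page in pages]
```

Supporting a new site then means writing one new parser class, not touching the fetching, scaling, or storage code.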
I tested the system on Twitter and scraped 200 million tweets from 16 million users, which came to about
173 GB. Full story: http://news.ycombinator.com/item?id=2633384
Technologies
Java, Node.js, D3, Solr, AWS, Rackspace, Git, MySQL, Redis, Munin.
YouTube didn't have at the time); you could queue videos and create watch-later playlists.
Buffer.me was all about good UX and UI, and good code.
Demo:
http://vimeo.com/1702691
This project is currently offline.
Thanks, Google, for blocking me!
Technologies
A lot of JavaScript, a lot of ActionScript, and a lot of UI design; also PHP and CentOS.
Lead Developer / First Engineer at Bandbox
January 2007 - June 2009 (2 years 6 months)
Bandbox is a free service which helps signed and independent artists discover new ways of selling digital and
physical products online. Bandbox uniquely leverages advertising to pay artists and music organizations
unprecedented margins on the sale of their products.
My responsibilities included:
- Building and maintaining an XML-driven Flash widget that helped artists sell their music online.
- Developing the communication logic between the widgets and the server.
- Developing a Flash-based application that let artists edit audio online.
- Coding an iPhone application that let users see the current status of their favorite artists.
- Prototyping an application that let artists create iPhone applications effortlessly.
- Contributing fresh ideas through demos and concepts built with emerging technologies and new design trends.
Bandbox was used by artists like Taylor Swift and Garth Brooks, among others.
Technologies
ActionScript, PHP, UI/UX, Objective-C, Java, FFmpeg.
1 recommendation available upon request
Developer at NRG Agency
2007 - 2009 (2 years)
- Flash, XML, and PHP; ideas, design, animation.
Web Developer at 123Flickr
April 2007 - July 2007 (4 months)
[Personal project]
Because Flickr did not work well as a professional portfolio, I decided to build 123Flickr.
123Flickr is a 1-click gallery creator that generates beautiful, highly interactive Flash galleries that you can
embed into your website or social network profile, without any setup, account creation, or need to upload
images. Just provide your Flickr username and 123Flickr will create a nice-looking gallery for you to share
with your clients and friends.
123Flickr was featured in Mashable and was used by more than 5,000 photographers as their personal
portfolio before I stopped the project.
http://mashable.com/2007/06/17/123flickr/
http://mashable.com/2007/06/28/host123flickr/
Technologies
PHP, JavaScript, ActionScript, CentOS, DNS. I was experimenting with this new idea called "APIs".
Flash Designer & Developer at NOVAQ
July 2006 - June 2007 (1 year)
- PHP, design, HTML, CSS, and ActionScript.
Languages
Spanish
English
Skills & Expertise
SOA
System Architecture
REST
Solr
Web Applications
Education
Veritas University
3D Animation, 2003 - 2005
Interests
holism, scalable systems, smart applications, artificial intelligence, user experience, usability, prototyping.
Carlos Chinchilla
Solutions Architect, Big Data, Consulting, Startups
calufa@gmail.com