Tag Archives: Tibco

Learn Apache Camel – Indexing Tweets in Real-time

via Learn Apache Camel – Indexing Tweets in Real-time | Building scalable enterprise applications.

There’s a point in most software development projects when the application needs to start communicating with other applications or third-party components.

Whether it’s sending an email notification, calling an external API, writing to a file, or migrating data from one place to another, you either roll your own solution or leverage an existing framework.

As for existing frameworks in the Java ecosystem, on one end of the spectrum we find TIBCO BusinessWorks and Mule ESB, and on the other end there’s Spring Integration and Apache Camel.

In this tutorial I’m going to introduce you to Apache Camel through a sample application that reads tweets from Twitter’s sample feed and indexes those tweets in real time using Elasticsearch.
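To give a feel for what such an application looks like, here is a minimal sketch of a Camel route that consumes Twitter’s streaming sample feed and indexes each tweet into Elasticsearch. This is an illustration, not the tutorial’s actual code: the endpoint URIs follow the Camel 2.x camel-twitter and camel-elasticsearch components, the credentials are placeholders, and the index/type names (`tweets`/`tweet`) are assumptions.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

// Sketch only: requires camel-core, camel-twitter and camel-elasticsearch
// on the classpath, plus real Twitter API credentials.
public class TweetIndexer {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Consume the public sample stream (credentials are placeholders)
                from("twitter://streaming/sample?type=event"
                        + "&consumerKey=YOUR_KEY&consumerSecret=YOUR_SECRET"
                        + "&accessToken=YOUR_TOKEN&accessTokenSecret=YOUR_TOKEN_SECRET")
                    // Tweets arrive as twitter4j Status objects; convert them to a
                    // String here for simplicity (a real route would map them to a
                    // proper JSON document before indexing)
                    .transform(body().convertToString())
                    // Index each message into a local Elasticsearch node
                    .to("elasticsearch://local?operation=INDEX"
                        + "&indexName=tweets&indexType=tweet");
            }
        });
        context.start();
        Thread.sleep(60_000); // let the stream run for a minute
        context.stop();
    }
}
```

The appeal of Camel here is that both integrations are a single endpoint URI each; swapping Elasticsearch for, say, a file or a JMS queue only changes the `to(...)` line.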

Video: Build Your Own Interface with JavaScript APIs

In the modern world of ever-changing Fast Data, users need an information delivery platform that’s nimble and responsive enough to keep pace with the flood of data flowing through their systems. TIBCO Live Datamart is here to fulfill that need and with it comes numerous ways you can tap into your information streams to gain real-time insights into your business.

Live Datamart 2.0 provides a JavaScript API. Developed from the ground up with the goals of simplicity and integration with modern JS frameworks, the Live Datamart JavaScript API enables our customers to easily write web applications that can be viewed on any modern device with a web browser. Using the API, you can turn your tables, grids, charts, plots, gauges, and maps into live components that change with your data.

In the following video, I’ll guide you through some of the key features of the API, introduce you to its documentation, and demonstrate how easy it is to create visualization components driven by TIBCO Live Datamart.

Java Backend Developer

Role : Java Backend Developer
Location: Beaverton, OR

Note: Interested candidates can send their updated resume to kdinesh@prokarma.com, or you can reach me at (402) 905-9212. Please share or like this post.

Needed for this Position
• Experience developing applications to run in a large-scale environment.
• Experience integrating and designing NoSQL technologies inside a real-time analytics stack. This can include MongoDB, Cassandra, HBase or Couchbase.
• Experience working with real-time streaming data service technologies such as Kinesis, Kafka, or Flume.
• Experience with distributed real-time computational (CEP) tools like Storm, TIBCO StreamBase, etc.
• Experience designing and developing highly and horizontally scalable cloud-based architectures; AWS cloud infrastructure experience strongly preferred.
• Experience developing in a continuous integration environment using Jenkins, Bamboo, or TeamCity CI frameworks.
• Experience writing automated unit and integration tests using JUnit or TestNG testing frameworks.

Preferred if you have:
• Experience working in an Agile development environment.
• Experience developing ecommerce-based web applications.
• Experience with cloud DevOps tools like Puppet, Chef or Vagrant.
• Experience with version control tools such as Git/Stash or SVN.
• Experience with Linux (CentOS, Ubuntu, etc.).
• Experience using and integrating with software and log analysis tools such as New Relic and Splunk.
• Experience developing in a TDD environment.
• Experience with indexing technologies such as Solr or Elasticsearch.
• Experience working with Hadoop or other MapReduce technologies.
• Experience in a Blue/Green deployment model.
• Experience in a continuous delivery environment.