Two Bright Lights Inc.
Two Bright Lights started out as a photo-sharing website where wedding and event photographers could create and provide marketing collateral for use by other event vendors such as florists, caterers, and event planners.
The website quickly gained traction within the photographer and vendor community, and thousands of photographs were soon being shared between photographers and vendors.
The website was set to outgrow the user and transaction base it was originally designed for, and it was clear that the application had to be ready for the larger scale before growth became a problem. That larger scale would naturally mean increased costs, so the website also needed to be monetized to ensure the growth was adequately funded.
The solution was to transform the site into a platform through which photographers submit their work to online and print publications. Photographers gain value by having their photographs featured in publications, increasing their visibility and generating word-of-mouth referrals from the subjects of the photographs.
Editors of publications gain value by finding a ready cache of photographs from real events featuring real people, instead of having to reuse stock photographs customized to their requirements.
Further, as the website gained popularity, partner sites needed photographs to be imported in bulk so that their users did not have to make multiple uploads or submissions. To stay user friendly, this process had to be transparent to the user and minimize their involvement.
The key challenges were to:
- Ensure that photographs were uploaded quickly and reliably, with the ability to resume after a lost connection.
- Ensure that uploads and downloads, which consume a lot of bandwidth and background server capacity, did not affect the functioning of regular website activities.
- Ensure fast processing of user requests for information, particularly on the editors' side, where hundreds of submissions are sifted through.
- Ensure that bulk imports of photographs from partner sites were carried out seamlessly, without interrupting the user experience.
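The resumable-upload requirement above can be met by transferring files in chunks and checkpointing the byte offset after each one. The sketch below is a minimal illustration of that pattern, not the production protocol: `send_chunk` stands in for whatever transport the real uploader uses (e.g. an HTTP PUT with a Content-Range header), and the progress-file checkpoint is an assumed mechanism.

```python
import os

CHUNK_SIZE = 1024 * 1024  # upload in 1 MB chunks (illustrative size)

def upload_resumable(path, send_chunk, progress_file):
    """Upload `path` chunk by chunk, persisting the byte offset so an
    interrupted transfer can resume instead of restarting from zero."""
    # Resume from the last recorded offset, if a checkpoint exists.
    offset = 0
    if os.path.exists(progress_file):
        with open(progress_file) as f:
            offset = int(f.read() or 0)
    total = os.path.getsize(path)
    with open(path, "rb") as f:
        f.seek(offset)
        while offset < total:
            chunk = f.read(CHUNK_SIZE)
            send_chunk(offset, chunk)   # stand-in for the real transport call
            offset += len(chunk)
            with open(progress_file, "w") as p:
                p.write(str(offset))    # checkpoint after every chunk
    if os.path.exists(progress_file):
        os.remove(progress_file)        # transfer complete, discard checkpoint
    return total
```

If the connection drops mid-transfer, a retry picks up at the last checkpointed offset rather than re-sending the whole file.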
Original images from photographers are high quality and therefore large. It was imperative to retain the full size for final delivery to publishers, but at the same time the large files could not be allowed to slow down delivery of images to editors searching through submissions.
To meet the above challenges and deliver the ideal solution for the client, we looked to the services available on the Amazon cloud.
The first task was to identify the resource-intensive processes and determine how the load could be split between them. Image upload and download was identified as the biggest consumer of server capacity and bandwidth, followed by serving submissions to the editors.
The next task was to determine how these processes could be optimized to provide the best user experience. It was clear that multiple servers would be needed, and the split was decided as follows:
- The main application server would host only the code and handle user requests involving database interaction.
- A standalone database server (with replication implemented using a master-slave methodology) would handle all database requests.
- Uploads and downloads would be handled on a standalone server, seamlessly from the user's point of view. This takes the heavy image-processing load off the main application server, since these activities do not require a real-time response and can be run as batch processes.
- All images would be stored on Amazon S3, taking advantage of Amazon's CDN (Content Delivery Network) for quick delivery of images to users without taxing the main application server with static content.
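One practical consequence of this split is that page templates must know which host serves each asset. The sketch below illustrates that routing decision; the hostnames, URL paths, and variant names are hypothetical, chosen only to show the static-versus-dynamic division described above.

```python
# Hypothetical hostnames -- the real CDN and application domains would differ.
CDN_BASE = "https://d1example.cloudfront.net"
APP_BASE = "https://app.example.com"

def image_url(image_id, variant="web"):
    """Return the URL a page should embed for an image.

    Pre-generated static renditions (thumbnails and web-size copies stored
    on S3) are served straight from the CDN, so the application server never
    touches them. Only the original high-resolution file, which needs an
    access-control check, is routed through the application server.
    """
    if variant in ("thumb", "web"):
        # Static content: served by the CDN, e.g.
        # https://d1example.cloudfront.net/images/web/12345.jpg
        return f"{CDN_BASE}/images/{variant}/{image_id}.jpg"
    # Originals are gated behind the app (authorized download).
    return f"{APP_BASE}/download/original/{image_id}"
```

With this scheme, everything an editor sees while browsing comes from the CDN edge, and the application server's bandwidth is spent only on dynamic requests.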
The final task was to create a background process to handle bulk activities: processing uploaded images into web-friendly sizes while retaining the original high-quality files for final delivery, batch processing of download requests, and bulk imports from partner sites.
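The shape of that background processor can be sketched as a small worker pool draining a job queue. The snippet below is illustrative, not the production code: the 1200-pixel "web-friendly" bound is an assumed target, and `handle` stands in for the real per-job work (which in practice would call an image library to resize the file).

```python
import queue
import threading

MAX_WEB_EDGE = 1200  # assumed web-friendly bound; the real target size may differ

def web_size(width, height, max_edge=MAX_WEB_EDGE):
    """Scale (width, height) so the longest edge fits max_edge,
    preserving aspect ratio; never upscale a smaller image."""
    longest = max(width, height)
    if longest <= max_edge:
        return width, height
    scale = max_edge / longest
    return round(width * scale), round(height * scale)

def run_workers(jobs, handle, n_workers=4):
    """Drain a job queue with a small thread pool -- the batch-processing
    shape used for resizing, download bundling, and bulk imports."""
    q = queue.Queue()
    for job in jobs:
        q.put(job)
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                job = q.get_nowait()
            except queue.Empty:
                return  # queue drained, worker exits
            out = handle(job)
            with lock:
                results.append(out)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Because the jobs are independent, the pool can be scaled up or down without touching the application server, which only enqueues work.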
By separating dynamic and static content delivery, the main application server could respond to requests much faster, and serving static content from the CDN ensured that pages and content loaded much faster for the end user.
Having a separate server handle upload and download requests meant fewer interruptions in the upload/download process and reduced the load on the main application server serving dynamic content.
Web-friendly versions of images served from the CDN let editors browse through submissions much more quickly, fetching the high-quality originals on demand only for their shortlisted images.
This infrastructure has enabled the website to handle over 40,000 active users and 7.5 million images while providing an unmatched platform connecting photographers with publishers.
HTML5, CSS3, jQuery 1.9.1 & Bootstrap
Amazon S3 buckets and EBS
SmugMug API integration
Dropbox API integration
Mad Mimi API integration
Facebook Share API integration
Twitter Post API integration
Ongoing work: HP mac cloud and ShootQ API integration