MetaForce: A MetaTrader4 Integration, Opening MT4 to the Web

We know we’ve been quiet here lately, but we’re happy to announce a full revamp of one of our products: MetaForce. MetaForce is a product we’ve had around for a while, and several clients are already using it.

What is MetaForce? MetaForce allows MetaTrader4 data to be extracted to numerous CRMs, support systems, and custom client areas. With MetaForce, brokers can do things they were never able to do before. A few highlights:

  • Process deposits automatically to MT4 via their payment processors
  • Manage MT4 accounts from their CRM, support system, or client area
  • Allow clients to reset their MT4 passwords from the web

There are two levels of MetaForce. The first syncs data from the MT4 platform into a CRM, support system, or client area. The second does the data sync but also allows interactions back from the CRM, support system, or client area into MT4, such as creating accounts and processing deposits.

MetaForce is the first product of its kind. Brokers are no longer required to use the MT4 programs to manually process account applications, deposits, and other business processes; they can now perform these actions from their own platforms. With MetaForce, brokers can streamline their processes and cut down on training time.

For more information, please visit the product’s site.

Symfony2 – Getting All Errors From a Form in a Controller

Recently I was working on an API call which uses a form to validate the data being passed in. I ran into an issue: getting all the errors from a form in Symfony2 is not as straightforward as you would think. At first I tried `$form->getErrors()` and looped through the errors, but that didn’t get all of them. After looking around I found:
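The helper looked roughly like this (a sketch; the `getAllErrors()` name is kept because the next paragraph refers to it, and the exact body is illustrative). It recurses through the form’s children and collects the errors bound to each field:

```php
use Symfony\Component\Form\Form;

// Recursively collect the error messages bound to each field of a form.
private function getAllErrors(Form $form)
{
    $errors = array();
    foreach ($form as $name => $child) {
        // Errors attached directly to this field
        foreach ($child->getErrors() as $error) {
            $errors[$name][] = $error->getMessage();
        }
        // Recurse into embedded forms and collections
        foreach ($this->getAllErrors($child) as $childName => $childErrors) {
            $errors[$name . '.' . $childName] = $childErrors;
        }
    }
    return $errors;
}
```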

This works really well for any errors which are bound to a field, but it will not catch global errors such as those from a unique validator (which makes the getAllErrors() name a bit misleading). To get those you also need to loop through $form->getErrors(), which for me returned only the global errors. Here is my code in the end:
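A sketch of what I ended up with (the ‘global’ key and the JSON response are illustrative; Response is Symfony’s HttpFoundation class):

```php
// Assumes: use Symfony\Component\HttpFoundation\Response;
$errors = array();

// Global (form-level) errors, e.g. from a unique validator
foreach ($form->getErrors() as $error) {
    $errors['global'][] = $error->getMessage();
}

// Field-level errors from the recursive helper above
$errors = array_merge($errors, $this->getAllErrors($form));

return new Response(json_encode(array('errors' => $errors)), 400);
```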

There may be a better way; I just wanted to put this out there since not many people had good solutions for it.

Bonus: if you are using the translator service with validator messages and you get back an untranslated key from the ‘validators’ translation files, make sure you pass the proper domain, i.e. `$translator->trans('key', array(), 'validators')`.

AJAX Request Slow With PHP? Here’s Why

Recently I was working on a project where we had a page which loads tons of data from numerous sources. I decided after a while that we wanted to AJAX each section of data so the page would load a bit quicker. After splitting up the requests and sending them asynchronously, there was little improvement. At first I thought it might be because we were pinging a single API for most of the data multiple times, but that wasn’t it. Maybe it was a browser limit? Nope, we were still well below the six concurrent requests most browsers allow. I set up xdebug and kcachegrind, and to my surprise it was session_start() that was taking the most time on the requests.

I looked around the web for a while trying to figure out what in the world was going on. It turns out that PHP’s default session handling blocks any future session_start() for the same session until the session is closed. This is because the default handler stores each session in a file on the filesystem, and it holds a lock on that file until you close the session. If you want more information on this and how to close the session early, you can read a bit more here.
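The practical fix on the PHP side is to release the lock with session_write_close() as soon as you have what you need; a minimal sketch (the session key is hypothetical):

```php
<?php
session_start();

// Read whatever this request needs out of the session first
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null; // hypothetical key

// Release the session file lock so parallel AJAX requests for the
// same session aren't serialized behind this one
session_write_close();

// Long-running work (API calls, heavy queries) can happen here without
// holding the lock; note that writes to $_SESSION after this point
// won't persist unless you call session_start() again.
```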

We switched over to database-backed sessions and that fixed it. In symfony 1.4 the default session storage uses the filesystem, but switching over to sfPDOSessionStorage is very quick and easy.
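The switch is only a few lines in factories.yml; a sketch, with table and column names following the symfony docs (adjust them to your schema):

```yaml
# apps/frontend/config/factories.yml
all:
  storage:
    class: sfPDOSessionStorage
    param:
      db_table:    session      # table that holds the sessions
      database:    doctrine     # connection name from databases.yml
      db_id_col:   sess_id
      db_data_col: sess_data
      db_time_col: sess_time
```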

Deleting files older than a specified time with s3cmd and bash

Update: Amazon has now made it so you can set expiration times on objects in S3, see more here: https://forums.aws.amazon.com/ann.jspa?annID=1303

Recently I was working on a project where we upload backups to Amazon S3. I wanted to keep the files around for a certain duration and remove any that were older than a month. We use the s3cmd utility for most of our command-line calls to S3, but it doesn’t have a built-in “delete any file in this bucket that is older than 30 days” function. After googling around a bit, we found some Python-based scripts, but none of them were a simple bash script that did what I was looking for. I whipped this one up real quick; it may not be the best looking, but it gets the job done:
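A minimal sketch of the script, assuming GNU date and the usual `s3cmd ls` output of `DATE TIME SIZE s3://path` (the bucket name is a placeholder):

```bash
#!/usr/bin/env bash
# Delete objects in a bucket that are older than 30 days.
# Assumes GNU date and `s3cmd ls` lines of the form: DATE TIME SIZE s3://path

BUCKET="s3://my-backups"             # placeholder: your bucket here
CUTOFF=$(date -d "30 days ago" +%s)  # cutoff as a unix timestamp

s3cmd ls "$BUCKET/" | while read -r day time size path; do
  # Skip DIR lines and anything without a full timestamp
  [ -z "$path" ] && continue

  ts=$(date -d "$day $time" +%s)
  if [ "$ts" -lt "$CUTOFF" ]; then
    echo "Deleting $path (last modified $day $time)"
    s3cmd del "$path"
  fi
done
```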