Tags: Apex, SOQL

I had to transfer opportunities and tasks between users today.

I noticed that the “Mass Transfer Records” option didn’t let me do it. I also found a couple of AppExchange apps that offered to do it, but I hate the hassle of installing foreign code into my own system.

So, I just did it myself via the System Log. I’m an old-fashioned sort of guy, so I even did it via the “old” version of the System Log. In case it’s useful for other people, here’s my code:

// Find open Opportunities owned by the old user
Opportunity[] opps = [select Id from Opportunity where OwnerId = '005200000010rF5' and StageName = 'Open'];
for (Opportunity o : opps) {
  o.OwnerId = '00520000003EtsZ';  // the new owner's user ID
}
update opps;

Of course, you’ll probably have different criteria than this, but it’s pretty straight-forward.

Here’s how I transferred tasks that were ‘Not Started’:

Task[] tasks = [select Id from Task where OwnerId = '005200000010rF5' and Status = 'Not Started'];
for (Task t : tasks) {
  t.OwnerId = '00520000003EtsZ';
}
update tasks;
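Incidentally, if you’d rather not copy user IDs out of the URL bar, you can look them up first. Here’s a quick sketch (the usernames are placeholders, not real ones):

// Look up the two users by username (placeholder values shown).
// Note: this single-row shorthand assumes exactly one match each.
Id fromUser = [select Id from User where Username = 'old.owner@example.com'].Id;
Id toUser = [select Id from User where Username = 'new.owner@example.com'].Id;

Task[] tasks = [select Id from Task where OwnerId = :fromUser and Status = 'Not Started'];
for (Task t : tasks) {
  t.OwnerId = toUser;
}
update tasks;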

The Bottom Line

  • You can’t transfer Opportunities or Tasks using the built-in tools
  • You can do it easily via some quick Anonymous Apex in the System Log
  • I’m an old-fashioned kinda boy

I received a notice from my friendly Salesforce rep recently, advising that I had gone over my storage limit:

The last time I had heard from Salesforce on such a matter was when Chatter went wild and took me to 493% of my storage allocation! Oh, you’ll also notice from the picture in that article how much my ‘Contact’ record storage had grown over the past year!

This time, my rep kindly offered to raise an invoice for the additional storage space. I’m cheap at heart, so I decided instead to reduce my storage usage. Not that I’m upset at Salesforce — I know it’s expensive to store data in their system because it’s all replicated between data centers, backed up, etc. However, I knew that a lot of my data was unnecessary, and I could just dump it.

To explain, I populate my Salesforce instance from an external system. I had over 220,000 Contact records, of which only a subset were required. So, I decided to remove Contact records:

  • For people who don’t own any of our products (defined in a custom field)
  • For records with no Activities

So, I ran Data Loader (actually the Mac version, LexiLoader, compliments of Simon Fell, who reminds people to vote for his Idea that Salesforce produce an official Mac version) and extracted a list of contacts who don’t own a product.

I then ran another Data Loader extract to get a list of all Activity records.

Next, I took the first list of contacts and subtracted any that were associated with the Activity records. (I couldn’t figure out how to do this in one SOQL statement, suggestions welcome!)
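(A possible single-query approach, sketched here with a made-up field name Owns_Product__c and untested against my data: SOQL anti-joins can exclude records referenced by a subquery, though they come with restrictions, so check before trusting it:)

select Id from Contact
where Owns_Product__c = false
and Id not in (select WhoId from Task)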

Finally, I took the list of record IDs and asked the Data Loader to do a bulk delete of the records. It took my storage way down:

I must say, the bulk delete operation was extremely fast, since the Data Loader uses the Bulk API for such operations.

The ‘Oops!’ moment

Things seemed fine until a couple of days later, when my users reported that records with Activities had been deleted. I went back and checked my work, only to discover that I had made an error in my “subtraction” step. Instead of taking all contacts and removing the IDs of contacts that had Activities, I subtracted the list of Activity IDs themselves. Since these objects have non-overlapping IDs (that is, no Activity ID matches any Contact ID), that operation did nothing.

End result: I deleted a lot of useful records. Gulp!

I did some searching and found rumors that Salesforce can undelete records, but charges a lot of money for the privilege. Not great, since it would cost more than I had originally tried to save!

Next, I investigated the Recycle Bin. Here’s what the official documentation says:

The Recycle Bin link in the sidebar lets you view and restore recently deleted records for 30 days before they are permanently deleted. Your recycle bin record limit is 250 times the Megabytes (MBs) in your storage. For example, if your organization has 1 GB of storage then your limit is 250 times 1000 MB or 250,000 records. If your organization reaches its Recycle Bin limit, Salesforce automatically removes the oldest records if they have been in the Recycle Bin for at least two hours.

My limit is actually 1GB (because we only have a small number of users, so we get the minimum size). Therefore, I get 250,000 records. Given that I deleted about 220,000 records, it means they’re all still in there!

I started to use the Recycle Bin ‘undelete’ function, but doing 200 at a time means I’d need to do it over 1,000 times!

So, I next tried some Apex in the System Log window, like this:

Contact[] c = [select id from contact where isDeleted = true LIMIT 1000 ALL ROWS];
undelete c;

However, some records didn’t want to undelete because our external system had already upserted replacements, and undeleting those records would have caused a clash of unique fields. When this happened, the whole undelete was rolled back rather than letting the non-clashing records through. Argh! So, I then went to something a bit more sophisticated:

// Get a list of deleted Contact records to undelete
Contact[] contacts = [select id, EmailAddr__c from contact where isDeleted = true limit 1000 ALL ROWS ];

// Put the Email addresses into an array
String[] emails = new String[]{};
for (Contact c : contacts) {
  emails.add(c.EmailAddr__c);
}

// Get a list of 'alive' Contacts (not deleted) that already use that email address
Contact[] alive = [select id, EmailAddr__c from contact where EmailAddr__c in :emails];
system.debug('Found: ' + alive.size());

// Remove Contacts that would clash from the undelete list
if (alive.size() != 0) {
  for (Contact c : alive) {
    for (Integer  i = 0; i < contacts.size(); ++i) {
      if (contacts[i].EmailAddr__c == c.EmailAddr__c) {
        contacts.remove(i);
        break;
      }
    }
  }
  system.debug('Will undelete: ' + contacts.size());

  // Undelete them!
  undelete contacts;
}

I should explain the EmailAddr__c thing. You see, Email is my external ID. However, I couldn’t use the standard Email field as an External ID because I can’t force it to be unique. So, I have a second field for Email address and I populate the both. For more details, see my earlier blog post.
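Incidentally, partial-success DML might have sidestepped the all-or-nothing rollback entirely. A sketch of the idea (not what I actually ran at the time):

// allOrNone = false undeletes the non-clashing records and reports
// each failure individually instead of rolling the whole lot back
Contact[] cs = [select Id from Contact where isDeleted = true limit 1000 ALL ROWS];
Database.UndeleteResult[] results = Database.undelete(cs, false);
for (Database.UndeleteResult r : results) {
  if (!r.isSuccess()) {
    System.debug('Skipped ' + r.getId() + ': ' + r.getErrors()[0].getMessage());
  }
}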

Anyway, my filter-then-undelete code above took about 2 minutes for 1000 records:

10:11:19.031 (31752000)|EXECUTION_STARTED
10:11:19.031 (31788000)|CODE_UNIT_STARTED|[EXTERNAL]|execute_anonymous_apex
10:11:19.032 (32365000)|SOQL_EXECUTE_BEGIN|[1]|Aggregations:0|select ...
10:11:19.074 (74698000)|SOQL_EXECUTE_END|[1]|Rows:1000
10:11:19.202 (202887000)|SOQL_EXECUTE_BEGIN|[6]|Aggregations:0|select ...
10:13:07.266 (108266842000)|SOQL_EXECUTE_END|[6]|Rows:157
10:13:07.267 (108267315000)|USER_DEBUG|[7]|DEBUG|Found: 157
10:13:15.949 (116949306000)|USER_DEBUG|[19]|DEBUG|Will delete: 896
10:13:15.950 (116950156000)|DML_BEGIN|[20]|Op:Undelete|Type:Contact|Rows:896
10:13:19.937 (120937987000)|DML_END|[20]

Most of the time was taken by the 2nd SOQL query (about 108 seconds), which matches on email. The loop to eliminate duplicates also took time (8 seconds). The undelete itself was relatively quick (4 seconds).

So, I included an ORDER BY clause in my initial query so that it tried older records first. This resulted in fewer email clashes and much faster execution times.
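(For the record, the tweak was just an ORDER BY on the initial query, something like this; CreatedDate ascending is an assumption for ‘older records first’:)

Contact[] contacts = [select Id, EmailAddr__c from Contact
                      where isDeleted = true
                      order by CreatedDate asc
                      limit 1000 ALL ROWS];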

Over the course of a day, I managed to undelete all the records. In fact, it sped up a lot after midnight San Francisco time (which is easy for me because I’m in Australia). Finally, I did my mass delete properly and everybody was happy.

The result:

How to avoid this error in future

Okay, I was doing dangerous stuff and I did it wrong. So how could I avoid this in future? Some ideas:

  • Make a backup first! Extract all data first (but that’s not easy!) or use the “Export Data” function (but that’s not easy to reload).
  • Try it in the Sandbox first. However, we have a Configuration-only Sandbox, without all the data. No good.
  • Test before committing the delete. I did pick random records, but obviously not enough.
  • Get somebody else to review my work before deleting.

The last idea reminds me of a quote from Kernighan and Pike’s famous book The Practice of Programming:

Another effective technique is to explain your code to someone else. This will often cause you to explain the bug to yourself. Sometimes it takes no more than a few sentences, followed by an embarrassed “Never mind, I see what’s wrong. Sorry to bother you.” This works remarkably well; you can even use non-programmers as listeners. One university computer center kept a teddy bear near the help desk. Students with mysterious bugs were required to explain them to the bear before they could speak to a human counselor.

I used that technique a lot at work. I ask somebody to “be my teddy bear”, tell them my problem, suddenly realize the solution, then thank them for their help even though they said nothing. Works every time!

Irony

Oh, here’s some irony. No sooner had I done the above than I received an email from Salesforce telling me that Recycle Bin limits are being cut:

Dear John,

At salesforce.com, Trust is our top priority, and it is our goal to improve the performance of our Recycle Bin functionality. With that in mind, we are making some changes to the Recycle Bin limits to provide you with a faster user experience.

What is the change and how does it impact me?
We are lowering the Recycle Bin retention period from 30 days to 15 days. The Recycle Bin link in the sidebar will now let you restore recently deleted records for 15 days before they are permanently deleted.

Additionally, we are lowering the Recycle Bin record limit from 250 times your storage to 25 times your storage. For example, if your organization has 1 GB of storage then your limit is 25 times 1000 MB or 25,000 records. If your organization reaches its Recycle Bin limit, Salesforce will automatically remove the oldest records if they have been in the Recycle Bin for at least two hours.

When is this change taking effect?
The lower Recycle Bin retention period will go into effect with the Winter ’12 Release.

The irony is that, had these reduced limits been in place, I would not have been able to recover my deleted data. Phew!

The Bottom Line

  • Test or verify before committing large data-related changes
  • You can’t do undelete via the Bulk API
  • The recycle bin is very big!
  • I’m cheap at heart
November 19, 2010
Tags: Apex, Data Loader

I was in the audience at Dreamforce 2009 when Marc Benioff first demonstrated Chatter. Personally, I was unimpressed because my workplace keeps all its corporate knowledge on a Confluence wiki, which includes the ability to add comments on pages and track activities.

So, I never bothered delving into Chatter. I didn’t even turn it on. Nonetheless, the dear folks at Salesforce activated it by default. This resulted in some cute ‘feed’ emails and users started adding a picture to their profile.

After a while, the emails turned from ‘fun’ to ‘annoying’ because we have an automated load process that loads hundreds of records several times a day. So, I found the Customize/Chatter settings in Setup and turned off emails. All done and dusted, right? Wrong!

A week or so later, I get a call from my local Salesforce office. “Did you know that your storage has increased dramatically lately?”

No.

So I looked in Storage Usage and was flabbergasted to see this:

Apparently the system had created over 16 million Chatter “Feed Tracked Changes” records, occupying 4GB of storage. That’s quite impressive given that I’ve got a 1GB data quota!

So, I immediately turned off Chatter and waited for my Salesforce contact to get me something called “Chatter Janitor” that could help clean up the mess. In the meantime, I searched discussion boards for a solution, only to find that other people had the same problem and couldn’t figure out how to delete the records!

Attempt 1: Via System Log

Fortunately, I came across a Purge Chatter Feed discussion on the Community forums. It shows how to delete feeds via the “Execute Apex” window of the System Log. I’ve simplified their example to do the following:

OpportunityFeed[] feed = [Select Id from OpportunityFeed limit 10000];
delete feed;

Unfortunately, it didn’t work for me. Eventually I discovered that the Feed records are only available if Chatter is turned on, so I had to activate it again.

The above bit of code deletes 10,000 records at a time, which is the maximum allowable under the Governor Limits. Unfortunately, with my 16 million records, this would take 1600 executions of the code. That’s a lot of clicking!

I started doing it and things were moving pretty quickly, until my scheduled batch load activated and I ended up with even more Feed records than when I started. Arrgh!

Then, thanks to some hints from David Schach, I found that I could turn off Chatter on specific objects. I hadn’t realised this at first because the option only appears if Chatter is turned on!

Attempt 2: Anonymous Apex

Okay, hitting the “Execute” button in the System Log 1600 times didn’t sound like fun, so I thought I’d check how to automate it. I soon figured out how to call executeAnonymous via an API call. All I’d need to do is repeatedly call the above Apex code.

I used the SoapUI plugin for Intellij (you can also use it standalone or in Eclipse), which makes it very easy to create XML for SOAP calls and even automate them using a Groovy Script. This worked generally well, but could still only delete 10,000 records per SOAP call and it was taking a while to execute and occasionally timed-out. So, this wasn’t going to be the perfect solution, either.

Attempt 3: Batch Apex

I did some research and found that Batch Apex has no governor limits if invoked with a QueryLocator. Well, it actually has a 50 million record limit, but that was good enough for me!

The online documentation is pretty good and I eventually created this class:

global class ZapChatter implements Database.Batchable<sObject>{

  global ZapChatter() {
    System.Debug('In Zap');
  }

  global Database.QueryLocator start(Database.BatchableContext BC) {
    return Database.getQueryLocator('Select Id from OpportunityFeed limit 1000000');
  }

  global void execute(Database.BatchableContext BC, List<sObject> scope) {
    delete scope;
  }

  global void finish(Database.BatchableContext BC) {
  }

}

Batch Apex works by taking a large number of records and then processing them in ‘batches’ (obvious, eh?). So, in the above code, the start method selects 1 million records and then the execute method is called for every batch. It is invoked with:

id batchinstanceid = database.executeBatch(new ZapChatter(), 10000);

This can be executed in the System Log or via the Force.com IDE ‘Anonymous Apex’ facility. The second parameter (10000) tells Batch Apex to use batches of 10,000 records.

Things worked, and didn’t work. I found that small batches of 200 or 1000 got executed very quickly. Batch sizes of 10,000 took a long time “in the queue”, taking 3 to 15 minutes between batches, probably a result of other workload in the system.

I then got greedy and tried one batch of 1 million records. This took 50 minutes to start the batch, only to fail with the error “Too many DML rows: 1000000”.

I then selected ALL 16 million records and requested a batch size of 10,000. This took 5 hours before it started the batches, with a total of 1674 batches required. I left it overnight but it didn’t run many batches, presumably because large batches are given low priority.
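A tip while waiting: you can watch a batch job’s progress from the System Log by querying AsyncApexJob, using the batchinstanceid returned by executeBatch above:

// How far along is the job?
AsyncApexJob job = [select Status, JobItemsProcessed, TotalJobItems, NumberOfErrors
                    from AsyncApexJob where Id = :batchinstanceid];
System.debug(job.Status + ': ' + job.JobItemsProcessed + ' of ' + job.TotalJobItems
             + ' batches, ' + job.NumberOfErrors + ' errors');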

Attempt 4: Deleting via Data Loader

During all this fun, I lodged a Support Case with Salesforce to obtain their advice. They suggested using the Data Loader to export a list of Feed IDs and then load them in a Delete operation. I also discovered that Chatter has to be activated, otherwise Data Loader will not show the Feed objects (eg AccountFeed, OpportunityFeed).

This method did work. However, the speed of deletion was not great. I was only getting a rate of about 100,000 records per hour, probably due to my low upload bandwidth from home. (Yes, this had already occupied my work day, and was seeping into my evening, too!) At that rate, it would still take 160 hours to complete — that’s a full week!

What’s worse, the Data Loader normally works in batches of 200. This would require 80,000 API calls to delete the records, and we have a limit of 28,000 API calls per day. So, that’s 3 days minimum!

Attempt 5: Bulk API

Since I’m using all these new Apex technologies, I then thought I’d try the new Bulk API. It’s basically Data Loader on steroids — same interface, but with a few options turned on in the Settings dialog.

Bingo! Load speed went up dramatically. The Bulk API uses parallel processing and you can watch your job while it loads! In my case, it was loading 10 parallel batches, chewing up gobs of “API Active Processing time”. I upped my batch size to 10,000 so my test file of 100,000 records loaded in 10 batches. This is handy, because there is a limit of 1,000 batches in a 24-hour period. So, 16 million records would use 1600 batches and would need to be spread across two days.

Since I’m in Australia and the speed of light somewhat impacts data transfers, I configured the Data Loader’s “Batch mode” to run from our data center in the USA. Attempting to extract 1 million records timed out after 10 minutes before even starting the download, so I dropped down to 100,000 records with a maximum extractionReadSize of 2000 (which is the download batch size). This took 4½ minutes to run. The upload itself took only 6 seconds (wow!), and the job took 7 minutes to run:

I then settled down, deleting in batches of 500,000. Success at last!

The Bottom Line

  • Chatter generates Feed objects when specified fields change
  • If you’re loading lots of records via the API, this might generate lots of Feed records
  • The feed records (eg AccountFeed, ContactFeed, OpportunityFeed) can consume a lot of space
  • If you want this to stop, turn off the individual Feeds (Setup, Customize, Chatter, Feed Tracking) but keep Chatter turned on for now
  • If you’ve only got a few hundred thousand records, it’s easiest to delete them via the System Log
  • If you’ve got millions of records, use the Data Loader and Bulk API to extract then delete them
  • When you’re all done, turn off Chatter. Phew!
Tags: Apex, SOQL

We recently reassigned a heap of Opportunities between staff members. However, the previous Opportunity Owner had open Activities on the Opportunities. The new Opportunity owners couldn’t close those Activities since they belong to somebody else.

So, they asked me to find a way to bulk-close the Activities.

This sounded simple, but was made more difficult by the fact that Activities can link to many different object types: Account, Opportunity, Campaign, Case or Custom Object.

The connection is made via WhatID, which is the ID of the associated object. It can be accessed via SOQL like this:

SELECT Id, What.Name from Task

However, not all fields are available, so you can’t SELECT What.OwnerId.

Fortunately, I found a forum post called Getting Object type of WhatId/WhoId Task/Event fields, which gave me a few hints, and I came up with this code:

// Get list of Opportunities owned by new person
Opportunity[] opps = [Select Id from Opportunity where OwnerId = '005200300014jeN'];

// Get incomplete Activities owned by previous person attached to the above Opportunities
Task[] tasks = [select Id, What.Name from Task where OwnerId = '00520000000tSOj' and Status != 'Completed' and WhatId in :opps];
for (Task t : tasks) { 
  t.Status = 'Completed';
}
update tasks;

This grabs a list of ‘owned’ Opportunities and checks for any Activities (which are actually Task objects) that have a WhatId matching those Opportunities.

Straight-forward and pretty simple. Almost makes up for not being able to traverse directly to the linked object.
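By the way, if you ever need to know what type of record a WhatId points to (rather than sidestepping the question with an IN filter, as I did), the forum post’s hint boils down to checking the ID’s three-character key prefix. A quick sketch:

// '006' is the Opportunity key prefix; better to look it up than hard-code it
String oppPrefix = Opportunity.SObjectType.getDescribe().getKeyPrefix();
for (Task t : [select Id, WhatId from Task where WhatId != null limit 200]) {
  if (String.valueOf(t.WhatId).startsWith(oppPrefix)) {
    System.debug('Task ' + t.Id + ' is attached to an Opportunity');
  }
}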

The Bottom Line

  • Activities can link to multiple objects
  • They connect via WhatId, but only a limited number of fields are exposed, eg What.Name
  • Use an ‘IN’ comparison to match the Activities with Opportunities
Tags: Apex, SOQL

I run into this problem all the time.

I want to write a quick routine in the System Log window to Mass Update some records (eg my previous Mass Delete via System Log window blog post). I want to find all records before a certain date, but SOQL never likes my date format, eg:

select Id from Opportunity where Expiry_Date__c < 2010-01-01

Yes, it is possible to convert it to the full date/time format and do it this way:

select Id from Opportunity where Expiry_Date__c < 2010-01-01T00:00:00Z

but I’ve always thought that silly when comparing against a Date field.

So, I eventually figured out that I can do it this way:

select Id, Name, LastLoginDate from User where LastLoginDate > :Date.valueOf('2008-01-01')

Of course, this only works within the context of Apex, such as the System Log window. It won’t work in pure SOQL tools like SOQL Explorer. Here’s an example:

Opportunity[] opps = [select Id from Opportunity where Expiry_Date__c < :Date.valueOf('2010-01-01')];
System.Debug(opps.size());

The Bottom Line

  • Date strings can’t be entered into SOQL
  • Option 1: Use the full date/time format: 2010-01-01T00:00:00Z
  • Option 2: Convert from a string: :Date.valueOf('2010-01-01')

I was configuring the Data Loader to export a list of Opportunities, and I went to select the “Owner Name”. However, only OwnerId is available on an Opportunity.

“No problem!” I think to myself, as I go and create a Custom Field with a formula equal to Owner.Alias.

“What?” I say in surprise. “It won’t let me access a field on the Owner object!”


Mmm. This is strange. Then a Google search reveals an 18-month-old Ideas request to Make “Owner Id” Look up fields available for formulas.

Oh dear.

Well, that’s a shame, but it’s easily solved! I created:

  • A field called Owner_Link__c of type Lookup(User)
  • A Trigger to copy OwnerId to Owner_Link__c when the Owner is changed
  • A test for the Trigger

Trigger:

trigger Update_OwnerLink_on_Owner_Update on Opportunity (before update, before insert) {

  // When the 'Owner' field is changed, update 'Owner_Link__c' too

  // Loop through the incoming records
  for (Opportunity o : Trigger.new) {

    // Has the Owner changed?
    if (o.OwnerId != o.Owner_Link__c) {
      o.Owner_Link__c = o.OwnerId;
    }
  }
}

Test:

public with sharing class TriggerTest_OwnerLink {

	static TestMethod void testOwnerLink() {

		// Grab two Users
		User[] users = [select Id from User limit 2];
		User u1 = users[0];
		User u2 = users[1];

		// Create an Opportunity
		System.debug('Creating Opportunity');
		Opportunity o1 = new Opportunity(CloseDate = Date.newInstance(2008, 01, 01), Name = 'Test Opportunity', StageName = 'New', OwnerId = u1.Id);
		insert o1;

		// Test: Owner_Link should be set to user 1
		Opportunity o2 = [select id, OwnerId, Owner_Link__c from Opportunity where Id = :o1.Id];
		System.assertEquals(u1.Id, o2.OwnerId);
		System.assertEquals(u1.Id, o2.Owner_Link__c);

		// Modify Owner
		o2.OwnerId = u2.Id;
		update o2;

		// Test: Owner_Link should be set to user 2
		Opportunity o3 = [select id, OwnerId, Owner_Link__c from Opportunity where Id = :o2.Id];
		System.assertEquals(u2.Id, o3.OwnerId);
		System.assertEquals(u2.Id, o3.Owner_Link__c);
	}
}
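One gotcha: the trigger only fires on future inserts and updates, so existing Opportunities need a one-off backfill. A no-op update from the System Log does the job (a sketch):

// Touching each record makes the 'before update' trigger copy OwnerId across
Opportunity[] all = [select Id from Opportunity limit 10000];
update all;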

This then gave me a new Owner lookup on my Opportunity on which I could create Formulas:


I could also use it in the Data Loader by referring to Owner_Link__r.Alias.

Hooray!

Easy to solve, but it’s a shame it was necessary.

The Bottom Line

  • Formulas can’t access Opportunity.Owner fields
  • Create a ‘shadow’ field to hold Owner and populate it via a Trigger
Tags: Apex

A new Salesforce blog post by Ümit Yalçinalp (whom I interviewed back at Dreamforce 2009) is showing some incredible new features for the System Log Console.

I first discovered the joys of the Console while trying to debug some Apex. I used it to call an Anonymous function to run my code, or sometimes I’d just paste my whole code into the debug window and run it from there. It’s also a great place to run some quick code to update records.

However, it’s not that great. It’s very hard to do good debugging in a web-based, multi-tenant environment. Some recent improvements were quite nice but they look decidedly antiquated compared to what’s going to be released.

It seems that we’ll be able to step through already-executed code. So, rather than pausing the whole Salesforce system, waiting for us to hit a ‘step’ button (not great for a multi-tenant system!), it captures very detailed information and allows us to play back the executed code while viewing the execution stack and the lines of Apex code.

Very clever indeed!

The Bottom Line

  • A new System Log Console is being piloted
  • It’s giving amazing capabilities for a cloud-based system
  • See the full article for more details
Tags: Apex

Wow, I just noticed the new System Log window in Salesforce.com:

The first big improvement is the provision of a Log History. When executing code in the top half, the log file appears in the bottom half. This also works when data is updated through other means, such as editing a record. Previous logs can be viewed by clicking on them. (I did, however, notice that it was a little slow when the log was large.)

Another little improvement is the way the “Execute” button changes to “Executing…” when clicked. I have always been confused over this, because the window appears to ‘hang’ when executing time-intensive code (eg looping over many records) and I could never tell if the command was correctly transmitted.

Thank you, Salesforce development team!

The Bottom Line

  • The System Log window has been updated
  • Multiple log histories are available
Tags: Apex, SOQL

Here’s a lesson I learned while making our Cloud Developer Challenge entry.

Take a look at these lines of code and tell me which ones are OK and which ones are dangerous:

integer count = [SELECT count() FROM Contact];
Contact c1 = [SELECT Id FROM Contact LIMIT 1];
Contact c2 = [SELECT Id FROM Contact LIMIT 1];
Id first = [SELECT Id FROM Contact LIMIT 1].Id;
Contact[] contacts = [SELECT Id FROM Contact LIMIT 1];
Contact[] allContacts = [SELECT Id FROM Contact];

If you execute the above code on most systems, it will run just fine. However, there is a situation in which you’ll get the dreaded List has no rows for assignment to SObject error. Can you figure out when?

It’s when there are no objects to return!

For example, if you have a system with no Contracts, try running the above code with Contract in place of Contact and you’ll get:

ERROR - Evaluation error: System.QueryException: List has no rows for assignment to SObject
ERROR - Evaluation error: AnonymousBlock: line 2, column 15

What happened? Well, these 3 lines will all give an error if there’s no row found:

Contact c1 = [SELECT Id FROM Contact LIMIT 1];
Contact c2 = [SELECT Id FROM Contact LIMIT 1];
Id first = [SELECT Id FROM Contact LIMIT 1].Id;

While a SELECT normally returns an array/list, these statements are using the shorthand syntax that assumes only one row is returned. What’s not obvious is that it also assumes that exactly one row is returned!

While this is unlikely to occur for Contact, it is highly likely to occur for any custom objects you create, especially when a WHERE clause is used that might return zero rows, such as:

Player__c player = [SELECT Id from Player__c where Name = :username];
if (player != null)
    p = player.Id;

The above code will fail if there is no Player__c record with the matching username. It doesn’t actually return a null.

It would be safer to do the following:

Player__c[] players = [SELECT Id from Player__c where Name = :username];
if (players.size() > 0)
    p = players[0].Id;

It’s one of those situations for which you wouldn’t normally think of creating a test, so it’s safer to just avoid the possibility rather than making an assumption about your data.
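If you really want the single-record shorthand, another way to guard is to catch the exception. A sketch:

// Fall back to null instead of letting the QueryException escape
Player__c player;
try {
  player = [SELECT Id from Player__c where Name = :username LIMIT 1];
} catch (System.QueryException e) {
  player = null;  // no matching row
}
if (player != null)
    p = player.Id;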

The Bottom Line

  • The shorthand syntax that expects one row to be returned from a SELECT statement is very handy
  • It’s also dangerous if no rows are found!

I’m pleased to announce my entry into the 2009 Force.com Cloud Developer Challenge.

The Challenge is designed to evangelize the Force.com platform, encouraging lots of people to discover, learn and use Force.com. The rules are wide open — just build something using Force.com, Visualforce and Sites (so that it is publicly accessible). It’s a very clever idea — sort of a small version of the X-Prize, with the concept that offering prizes will encourage more people to do interesting things than directly paying them would!

Our Entry

So, would you like to know about our entry?

Yes, I use the word ‘Our’ rather than ‘My’ because I’m happy to say that I teamed up with David Schach, author of the X-Squared on Demand blog. Previous readers will remember that I met David some time ago when he visited Australia. Well, it just so happened that David read my previous blog entry about the Cloud Developer Challenge and dropped me an email to say he was visiting Australia again, and did I want some help?

This was a godsend, because I had been hitting lots of brick walls in my ramp-up of Visualforce and Sites knowledge, and David is an absolute expert in the subject. So, I did all the high-level UI, he did all the low-level ‘engine room’ stuff, and we worked in the middle to make a very exciting site.

Introducing Daily Shinro

The site we developed is a Social Gaming Website that we call Daily Shinro.


The idea for the site began with my work team at Atlassian, where we play a daily game of SET, a really fun logic puzzle that changes each day. To keep track, we created a shared Google Apps spreadsheet of our scores, and we soon discovered that it was actually just as fun to analyze the scores as to play the game!

We had been looking around for another daily puzzle that we could all play, preferably something that only took a couple of minutes and which had a scoring element. Unfortunately, very few online games took our interest. However, around that time, I had become addicted to playing Shinro Mines on the iPhone. Shinro is a little-known puzzle that has been described as a cross between Minesweeper and Sudoku.

So, once I put together the desire to enter the Cloud Developer Challenge, the need for another team puzzle with a social scoring element and my enjoyment of Shinro Mines, the choice of project became obvious!

A Social Gaming Website

There are really two parts to Daily Shinro: the game and the social aspect.

The Game
I created the game in JavaScript using the fantastic open-source Raphael JavaScript graphics library, which provides cross-browser graphics. It renders VML for IE (boo!) and SVG for every other browser (yay!). It even supports animation.

Raphael was actually written by Dmitry Baranovskiy, a colleague of mine at Atlassian. I highly recommend you look at the demo pages!

To provide a ‘daily’ concept for the game, a new puzzle is made available each day. The background picture for the puzzle is a daily selection from flickr, selected with the help of the beautiful Idée Multicolr Search Lab. It just adds a bit of spice and variety to the daily puzzle!

Social Gaming
From the very beginning, my intention was to create a ‘social’ gaming experience, based upon the spreadsheet developed by my work team. Basically, scores are calculated by how well each player beats the team average. Therefore, it needs at least two people to play the puzzle, and all scores net out to zero. This has the advantage that, if somebody doesn’t play one day, they aren’t penalized.

This scoring system is implemented at two levels in Daily Shinro — Public and ‘League’.

At the Public level, players’ scores are compared to the public average. At the League level, scores are compared only amongst your friends or office co-workers who are members of the same League.

Leagues are all about comparisons within a team rather than between teams — sort of like a private gaming site for friends. Thus, everyone can create their own League and invite friends to join. The score graphs then reflect the scores amongst members of the League.

League Score sample

The charts are generated with some magical Apex code that gathers up League scores, compares players to game averages and then outputs the results into a Google Charts URL.
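To give a flavor of that last step (this is an illustrative sketch, not the site’s actual code), building an image-chart URL is mostly string assembly against Google’s chart parameters:

// Illustrative only: real scores come from League records
Decimal[] scores = new Decimal[]{ 10, -5, 3 };
String data = '';
for (Decimal s : scores) {
  data += (data == '' ? '' : ',') + String.valueOf(s);
}
// cht = chart type (line), chs = size, chd = data series
String chartUrl = 'http://chart.apis.google.com/chart?cht=lc&chs=400x200&chd=t:' + data;
System.debug(chartUrl);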

I invite you all to visit www.dailyshinro.com and play the game!

If you find any bugs or strange behaviors, please let us know!

Will we win?

Ah, that’s the big question! My personal goal is to have the site ‘mentioned’ in the results pages of the Cloud Developer Challenge. That way, we’ll feel rewarded for the long hours that were put into the site.

We’ve done several things to give us an advantage in the competition:

  • We made a fun site that is attractive and enjoyable to use.
  • We utilized lots of different technologies to showcase how Force.com can ‘pull together’ capabilities from across the web to build an even richer application environment.
  • We added lots of pages of explanations, showing how we used Force.com technology. I’m hoping that this will help the folks at Salesforce.com promote the Force.com platform, which means that they’ll want lots of people visiting our site to help promote their technology. And what better way to do this than mentioning us in the final results?! (Clever, eh!)
  • Finally, there’s this blog, which I know is read by about 1000 people per month (including some legendary Salesforce staff!). Throw in David’s blog and we’re hoping to get the attention of enough people in Salesforce that we’ll get on the short-list.

Oh, one word to the judges — we fixed a few bugs in the last couple of days. So, if you visited the site already, please visit again and see it in its fully-working glory!

It’s been a tough Challenge, but it was great fun and an utterly fantastic way to learn Visualforce and Sites. I started with zero knowledge and now consider myself a competent user of those technologies. I’d also like to thank David Schach whose knowledge of Visualforce and Sites is just incredible. I couldn’t have done this site without him.

The Bottom Line

  • Visualforce and Sites are mature and capable technologies for building websites on Force.com technology
  • The Cloud Developer Challenge was an excellent way to ‘spread the word’ and get people to use the technologies
  • You’ve got to visit Daily Shinro!
  • If you know the judges, please tell them how great we are! :)