Tags: Apex, SOQL

I had to transfer opportunities and tasks between users today.

I noticed that the “Mass Transfer Records” option didn’t let me do it. I also found a couple of AppExchange apps that offered to do it, but I hate the hassle of installing foreign code into my own system.

So, I just did it myself via the System Log. I’m an old-fashioned sort of guy, so I even did it via the “old” version of the System Log. In case it’s useful for other people, here’s my code:

Opportunity[] opps = [select Id from Opportunity where OwnerId = '005200000010rF5' and StageName = 'Open'];
for (Opportunity o : opps) {
  o.OwnerId = '00520000003EtsZ';
}
update opps;

Of course, you’ll probably have different criteria than this, but it’s pretty straight-forward.

Here’s how I transferred tasks that were ‘Not Started’:

Task[] tasks = [select Id from Task where OwnerId = '005200000010rF5' and Status = 'Not Started'];
for (Task t : tasks) {
  t.OwnerId = '00520000003EtsZ';
}
update tasks;

The Bottom Line

  • You can’t transfer Opportunities or Tasks using the built-in tools
  • You can do it easily via some quick Anonymous Apex in the System Log
  • I’m an old-fashioned kinda boy
Tags: Data Loader, SOQL

Our Salesforce instance loads quite a bit of data via the Data Loader (so much, that it sometimes screws things up). This is done in automated batches via the command line. Better yet, the information is extracted directly from our JDBC database — no disk files are involved! To top it off, it even runs on a Linux system (which was a little difficult because Data Loader is only distributed as a Windows executable!).

We also export some data from Salesforce, again inserting it directly into a database.

My challenge for today was to extract some information from the Opportunity Stage History (kept in the OpportunityHistory object). Specifically, I wanted to extract some Opportunity data together with the date that the Opportunity’s Stage was set to ‘Lost’. This required ‘joining’ between the Opportunity and OpportunityHistory objects.

I referred to the trusty SOQL Relationship Queries documentation and wound up writing a query on the OpportunityHistory object that also included data from the ‘parent’ Opportunity object:

select
  Opportunity.Reason_Lost__c, -- My custom fields
  CreatedDate    -- Date on the Stage History record
from OpportunityHistory
where
  StageName = 'Lost'  -- On OpportunityHistory
  and Opportunity.Reason_lost__c != ''
  and Opportunity.CreatedDate >= 2009-11-01T00:00:00Z  -- I always hate having to write dates like this!

This very nicely returned me rows from Opportunity objects together with the date that the Stage was changed to ‘Lost’.

However, I had a lot of trouble getting it to load, and then I realised that my SDL file (used to map fields) also had to have the ‘Opportunity.‘ prefix!

In export_map_reason_lost.sdl
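A mapping along these lines should do the trick (a sketch on my part; the database column names are taken from the SQL in database-conf.xml below, and how the @period@ parameter was populated isn’t shown here):

Opportunity.Reason_Lost__c=reason_lost
Opportunity.Expiry_date__c=expiry_date
Opportunity.Product_Family__c=product
CreatedDate=lost_date

Note that the first three need the fully qualified ‘Opportunity.’ prefix, while CreatedDate lives on OpportunityHistory itself, so it doesn’t.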


At this point, I should give a shout-out to my favourite Salesforce employee, Simon Fell (see my interview with him at Dreamforce 2009). He has written a great Mac utility called Soql Xplorer that makes writing SOQL a snap!

To assist any readers who are trying to nut-out how to use Data Loader to push/pull data to/from a database, here are the relevant entries in my XML files:

In process-conf.xml:

    <bean id="extractReasonLost"
          class="com.salesforce.dataloader.process.ProcessRunner"
          singleton="false">
        <description>Extract Lost Opportunities to discover Reason Lost</description>
        <property name="name" value="extractReasonLostName"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.endpoint" value=""/>
                <entry key="sfdc.username" value=""/>
                <entry key="sfdc.password" value="secret"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.loadBatchSize" value="100"/>
                <entry key="sfdc.entity" value="Opportunity"/>
                <entry key="process.enableLastRunOutput" value="false"/>
                <entry key="sfdc.extractionRequestSize" value="500"/>
                <entry key="sfdc.extractionSOQL" value="select Opportunity.Reason_Lost__c, Opportunity.Expiry_date__c,
                                 Opportunity.Product_Family__c, CreatedDate
                                 from OpportunityHistory where StageName = 'Lost' and Opportunity.Reason_lost__c != ''
                                 and Opportunity.CreatedDate >= 2009-11-01T00:00:00Z"/>
                <entry key="process.operation" value="extract"/>
                <entry key="process.mappingFile" value="export_map_reason_lost.sdl"/>
                <entry key="dataAccess.type" value="databaseWrite"/>
                <entry key="dataAccess.name" value="extractReasonLostBean"/>
            </map>
        </property>
    </bean>

In database-conf.xml:

<bean id="extractReasonLostBean"
      class="com.salesforce.dataloader.dao.database.DatabaseConfig"
      singleton="true">
    <property name="sqlConfig"  ref="extractReasonLostQuery"/>
    <property name="dataSource" ref="server_name"/>
</bean>

<bean id="extractReasonLostQuery"
      class="com.salesforce.dataloader.dao.database.SqlConfig" singleton="true">
    <property name="sqlString">
        <value>
            INSERT INTO renewals_lost (
               period, reason_lost, expiry_date, product, lost_date)
            VALUES (@period@::numeric, @reason_lost@, @expiry_date@, @product@, @lost_date@)
        </value>
    </property>
    <property name="sqlParams">
        <map>
            <entry key="period"           value="integer"/>
            <entry key="reason_lost"      value="java.lang.String"/>
            <entry key="expiry_date"      value="java.sql.Date"/>
            <entry key="product"          value="java.lang.String"/>
            <entry key="lost_date"        value="java.sql.Date"/>
        </map>
    </property>
</bean>

The Bottom Line

  • Data Loader is very powerful for importing and exporting directly to/from a database
  • When extracting via SOQL that involves a relationship, include the fully qualified name in the SDL file (eg Opportunity.Stage)
  • Simon Fell is my hero!

I received a notice from my friendly Salesforce rep recently, advising that I had gone over my storage limit:

The last time I had heard from Salesforce on such a matter was when Chatter went wild and took me to 493% of my storage allocation! Oh, you’ll also notice from the picture in that article how much my ‘Contact’ record storage had grown over the past year!

This time, my rep kindly offered to raise an invoice for the additional storage space. I’m cheap at heart, so I decided instead to reduce my storage space. Not that I’m upset at Salesforce — I know it’s expensive to store data in their system because it’s all replicated between data centers, backed-up, etc. However, I knew that a lot of my data was unnecessary, and I could just dump it.

To explain, I populate my Salesforce instance from an external system. I had over 220,000 Contact records, of which only a subset were required. So, I decided to remove Contact records:

  • For people who don’t own any of our products (defined in a custom field)
  • For records with no Activities

So, I ran Data Loader (actually, the Mac version which is LexiLoader, compliments of Simon Fell, who reminds people to vote for his Idea that Salesforce produce an official Mac version) and extracted a list of contacts who don’t own a product.

I then ran another Data Loader extract to get a list of all Activity records.

Next, I took the first list of contacts and subtracted any contacts associated with the Activity records. (I couldn’t figure out how to do this in one SOQL statement, suggestions welcome!)
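For what it’s worth, one possible single-statement approach is SOQL’s anti-join syntax. This is a sketch I haven’t run against that data, and Owns_Product__c is a hypothetical stand-in for the real custom field:

Contact[] toDelete = [select Id from Contact
                      where Owns_Product__c = false
                        and Id not in (select WhoId from Task)];

Since Activities cover Events as well as Tasks, a matching anti-join against Event.WhoId would also be needed to be safe.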

Finally, I took the list of record IDs and asked the Data Loader to do a bulk delete of the records. It took my storage way down:

I must say, the bulk delete operation was extremely fast, since the Data Loader uses the Bulk API for such operations.

The ‘Oops!’ moment

Things seemed fine until a couple of days later, when my users reported that records with Activities had been deleted. I went back and checked my work, only to discover that I had made an error in my “subtraction” step. Instead of taking all contacts and removing the IDs that matched the list of contacts that had Activities, I subtracted the list of Activities themselves. Since these objects have non-overlapping IDs (that is, no Activity ID matched any Contact ID), that operation did nothing.

End result: I deleted a lot of useful records. Gulp!

I did some searching and found rumors that Salesforce could undelete records, but charge a lot of money for the privilege. Not great, since it would cost more than I had originally tried to save!

Next, I investigated the Recycle Bin. Here’s what the official documentation says:

The Recycle Bin link in the sidebar lets you view and restore recently deleted records for 30 days before they are permanently deleted. Your recycle bin record limit is 250 times the Megabytes (MBs) in your storage. For example, if your organization has 1 GB of storage then your limit is 250 times 1000 MB or 250,000 records. If your organization reaches its Recycle Bin limit, Salesforce automatically removes the oldest records if they have been in the Recycle Bin for at least two hours.

My limit actually is 1GB (because we only have a small number of users, so we get the minimum size). Therefore, I get 250,000 records. Given that I deleted about 220,000 records, it means they’re all still in there!

I started to use the Recycle Bin ‘undelete’ function, but doing 200 at a time means I’d need to do it 1000 times!

So, I next tried some Apex in the System Log window, like this:

Contact[] c = [select id from contact where isDeleted = true LIMIT 1000 ALL ROWS];
undelete c;

However, some records didn’t want to undelete because our external system had already upserted replacements, and undeleting those records would have caused a clash of unique fields. Worse, when this happened the whole undelete was rolled back rather than letting the non-clashing records through. Argh! So, I then went to something a bit more sophisticated:

// Get a list of deleted Contact records to consider undeleting
Contact[] contacts = [select id, EmailAddr__c from contact where isDeleted = true limit 1000 ALL ROWS];

// Put the Email addresses into an array
String[] emails = new String[]{};
for (Contact c : contacts) {
  emails.add(c.EmailAddr__c);
}

// Get a list of 'alive' Contacts (not deleted) that already use those email addresses
Contact[] alive = [select id, EmailAddr__c from contact where EmailAddr__c in :emails];
system.debug('Found: ' + alive.size());

// Remove any clashing Contacts from the undelete list
if (alive.size() != 0) {
  for (Contact c : alive) {
    for (Integer i = 0; i < contacts.size(); ++i) {
      if (contacts[i].EmailAddr__c == c.EmailAddr__c) {
        contacts.remove(i);
        --i;  // Stay at this index, since remove() shifted the list left
      }
    }
  }
}
system.debug('Will undelete: ' + contacts.size());

// Undelete them!
undelete contacts;

I should explain the EmailAddr__c thing. You see, Email is my external ID. However, I couldn’t use the standard Email field as an External ID because I can’t force it to be unique. So, I have a second field for Email address and I populate them both. For more details, see my earlier blog post.

Anyway, the above code took about 2 minutes for 1000 records:

10:11:19.031 (31752000)|EXECUTION_STARTED
10:11:19.031 (31788000)|CODE_UNIT_STARTED|[EXTERNAL]|execute_anonymous_apex
10:11:19.032 (32365000)|SOQL_EXECUTE_BEGIN|[1]|Aggregations:0|select ...
10:11:19.074 (74698000)|SOQL_EXECUTE_END|[1]|Rows:1000
10:11:19.202 (202887000)|SOQL_EXECUTE_BEGIN|[6]|Aggregations:0|select ...
10:13:07.266 (108266842000)|SOQL_EXECUTE_END|[6]|Rows:157
10:13:07.267 (108267315000)|USER_DEBUG|[7]|DEBUG|Found: 157
10:13:15.949 (116949306000)|USER_DEBUG|[19]|DEBUG|Will delete: 896
10:13:15.950 (116950156000)|DML_BEGIN|[20]|Op:Undelete|Type:Contact|Rows:896
10:13:19.937 (120937987000)|DML_END|[20]

Most of the time taken was for the 2nd SOQL query (106 seconds), which matches on email. The loop to eliminate duplicates also took time (8 seconds). The undelete itself was relatively quick (4 seconds).

So, I included an ORDER BY clause in my initial query so that it tried older records first. This resulted in fewer email clashes, and much faster execution times.
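The adjusted query would have looked something like this (a sketch; ordering on CreatedDate ascending is my assumption of what “older records first” means here):

Contact[] contacts = [select id, EmailAddr__c from contact
                      where isDeleted = true
                      order by CreatedDate
                      limit 1000 ALL ROWS];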

Over the course of a day, I managed to undelete all the records. In fact, it sped up a lot after midnight San Francisco time (which is easy for me because I’m in Australia). Finally, I did my mass delete properly and everybody was happy.

The result:

How to avoid this error in future

Okay, I was doing dangerous stuff and I did it wrong. So how could I avoid this in future? Some ideas:

  • Make a backup first! Extract all data first (but that’s not easy!) or use the “Export Data” function (but that’s not easy to reload).
  • Try it in the Sandbox first. However, we have a Configuration-only Sandbox, without all the data. No good.
  • Test before committing the delete. I did pick random records, but obviously not enough.
  • Get somebody else to review my work before deleting.

The last idea reminds me of a quote from Kernighan and Pike’s famous book The Practice of Programming:

Another effective technique is to explain your code to someone else. This will often cause you to explain the bug to yourself. Sometimes it takes no more than a few sentences, followed by an embarrassed “Never mind, I see what’s wrong. Sorry to bother you.” This works remarkably well; you can even use non-programmers as listeners. One university computer center kept a teddy bear near the help desk. Students with mysterious bugs were required to explain them to the bear before they could speak to a human counselor.

I used that technique a lot at work. I ask somebody to “be my teddy bear”, tell them my problem, suddenly realize the solution, then thank them for their help even though they said nothing. Works every time!


Oh, here’s some irony. No sooner had I done the above than I received an email from Salesforce telling me that Recycle Bin limits are being cut:

Dear John,

At Salesforce, Trust is our top priority, and it is our goal to improve the performance of our Recycle Bin functionality. With that in mind, we are making some changes to the Recycle Bin limits to provide you with a faster user experience.

What is the change and how does it impact me?
We are lowering the Recycle Bin retention period from 30 days to 15 days. The Recycle Bin link in the sidebar will now let you restore recently deleted records for 15 days before they are permanently deleted.

Additionally, we are lowering the Recycle Bin record limit from 250 times your storage to 25 times your storage. For example, if your organization has 1 GB of storage then your limit is 25 times 1000 MB or 25,000 records. If your organization reaches its Recycle Bin limit, Salesforce will automatically remove the oldest records if they have been in the Recycle Bin for at least two hours.

When is this change taking effect?
The lower Recycle Bin retention period will go into effect with the Winter ’12 Release.

The irony is that, had these reduced limits been in place, I would not have been able to recover my deleted data. Phew!

The Bottom Line

  • Test or verify before committing large data-related changes
  • You can’t do undelete via the Bulk API
  • The recycle bin is very big!
  • I’m cheap at heart
Tags: Apex, SOQL

We recently reassigned a heap of Opportunities between staff members. However, the previous Opportunity Owner had open Activities on the Opportunities. The new Opportunity owners couldn’t close those Activities since they belong to somebody else.

So, they asked me to find a way to bulk-close the Activities.

This sounded simple, but was made more difficult by the fact that Activities can link to many different object types: Account, Opportunity, Campaign, Case or Custom Object.

The connection is made via WhatId, which holds the ID of the associated object. It can be accessed via SOQL like this:

SELECT Id, What.Name from Task

However, not all fields are available, so you can’t SELECT What.OwnerId.
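One field that is exposed on the polymorphic lookup is What.Type, which names the type of object the WhatId points at. For example (my own sketch, not from the original post, and worth verifying against your API version):

Task[] oppTasks = [select Id, What.Type, What.Name from Task where What.Type = 'Opportunity'];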

Fortunately, I found a forum post called Getting Object type of WhatId/WhoId Task/Event fields gave me a few hints, and I came up with this code:

// Get list of Opportunities owned by the new person
Opportunity[] opps = [Select Id from Opportunity where OwnerId = '005200300014jeN'];

// Get incomplete Activities owned by the previous person, attached to the above Opportunities
Task[] tasks = [select Id, What.Name from Task where OwnerId = '00520000000tSOj' and Status != 'Completed' and WhatId in :opps];
for (Task t : tasks) {
  t.Status = 'Completed';
}
update tasks;

This grabs a list of ‘owned’ Opportunities and checks for any Activities (which are actually Task objects) that have a WhatId matching those Opportunities.

Straight-forward and pretty simple. Almost makes up for not being able to traverse directly to the linked object.

The Bottom Line

  • Activities can link to multiple objects
  • They connect via WhatId, but only a limited number of fields are exposed, eg What.Name
  • Use an ‘IN’ comparison to match the Activities with Opportunities
Tags: Apex, SOQL

I run into this problem all the time.

I want to write a quick routine in the System Log window to mass-update some records (eg my previous Mass Delete via System Log window blog post). I want to find all records before a certain date, but SOQL never likes my date format, eg:

select Id from Opportunity where Expiry_Date__c < 2010-01-01

Yes, it is possible to convert it to Timezone format and do it this way:

select Id from Opportunity where Expiry_Date__c < 2010-01-01T00:00:00Z

but I’ve always thought that silly when comparing against a Date field.

So, I eventually figured out that I can do it this way:

select Id, Name, LastLoginDate from User where LastLoginDate > :Date.valueOf('2008-01-01')

Of course, this only works within the context of Apex, such as the System Log window. It won’t work in pure SOQL tools like SOQL Explorer. Here’s an example:

Opportunity[] opps = [select Id from Opportunity where Expiry_Date__c < :Date.valueOf('2010-01-01')];

The Bottom Line

  • Date strings can’t be entered into SOQL
  • Option 1: Use Timezone format: 2010-01-01T00:00:00Z
  • Option 2: Convert from string: :Date.valueOf('2010-01-01')
Tags: Apex, SOQL

Here’s a lesson I learned while making our Cloud Developer Challenge entry.

Take a look at these lines of code and tell me which ones are OK and which ones are dangerous:

integer count = [SELECT count() FROM Contact];
Contact c1 = [SELECT Id FROM Contact LIMIT 1];
Contact c2 = [SELECT Id FROM Contact LIMIT 1];
Id first = [SELECT Id FROM Contact LIMIT 1].Id;
Contact[] contacts = [SELECT Id FROM Contact LIMIT 1];
Contact[] allContacts = [SELECT Id FROM Contact];

If you executed the above code on most systems, it would run just fine. However, there is a situation in which you’ll get the dreaded List has no rows for assignment to SObject error. Can you figure out which ones are at risk?

It’s when there are no objects to return!

For example, if you have a system with no Contracts, try running the above code with Contract in place of Contact and you’ll get:

ERROR - Evaluation error: System.QueryException: List has no rows for assignment to SObject
ERROR - Evaluation error: AnonymousBlock: line 2, column 15

What happened? Well, these 3 lines will all give an error if there’s no row found:

Contact c1 = [SELECT Id FROM Contact LIMIT 1];
Contact c2 = [SELECT Id FROM Contact LIMIT 1];
Id first = [SELECT Id FROM Contact LIMIT 1].Id;

While a SELECT normally returns an array/list, these statements are using the shorthand syntax that assumes only one row is returned. What’s not obvious is that it also assumes that exactly one row is returned!

While this is unlikely to occur for Contact, it is highly likely to occur for any custom objects you create, especially when a WHERE statement is used that might return zero rows, such as:

Player__c player = [SELECT Id from Player__c where Name = :username];
if (player != null)
    p = player.Id;

The above code will fail if there is no Player__c record with the matching username. It doesn’t actually return a null.

It would be safer to do the following:

Player__c[] players = [SELECT Id from Player__c where Name = :username];
if (players.size() > 0)
    p = players[0].Id;

It’s one of those situations for which you wouldn’t normally think of creating a test, so it’s safer to just avoid the possibility rather than making an assumption about your data.

The Bottom Line

  • The shorthand syntax that expects one row to be returned from a SELECT statement is very handy
  • It’s also dangerous if no rows are found!