I would think processing this sort of filtering on 12,000 records is unfortunately going to be slow. The cost scales with how many entries are in your `rateSheetSummaries` collection: with one entry you're looking at 1 × 12,000 searches, but with 100 entries that balloons to 1,200,000 searches. Even worse is that every one of these searches performs the following:
```csharp
d.Description
    .ToLower()
    .Trim()
    .Contains(rateSheetSummary
        .DestinationLookup
        .ToLower()
        .Trim())
```

That's a lot of operations (and a lot of short-lived string allocations) to be doing on every single item.
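To put rough numbers on it, here's a back-of-the-envelope sketch using the figures above; the "up to four" assumes each `ToLower()` returns a new string and each `Trim()` does too when there is whitespace to remove:

```csharp
using System;

long summaries = 100;                        // entries in rateSheetSummaries
long destinationCount = 12_000;              // records being filtered
long checks = summaries * destinationCount;  // 1,200,000 substring searches
long tempStrings = checks * 4;               // up to 4,800,000 short-lived strings
Console.WriteLine($"{checks:N0} checks, up to {tempStrings:N0} allocations");
```

All of those temporary strings are immediately garbage, which puts pressure on the collector on top of the raw search work.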
Instead, think about ways to optimise this particular operation:

1. The `Contains()` method has an overload that takes a `StringComparison`, so you can match case-insensitively without allocating lowercased copies of both strings (available since .NET Core 2.1 / .NET Standard 2.1).
2. Consider whether calling `Trim()` on every `d.Description` is needed at all; if the data is already clean, it's pure overhead.
3. You could cache the normalised `DestinationLookup` once per summary, prior to the inner search, instead of recomputing it for every destination.
4. The call to `ToList()` on the results of the search seems unnecessary; it materialises the whole result set into an intermediate list even though you only enumerate it once.

Something like:
```csharp
foreach (var rateSheetSummary in rateSheetSummaries)
{
    // Normalise the search term once per summary, not once per destination.
    string search = rateSheetSummary.DestinationLookup.Trim();

    IEnumerable<Destination> insertDestinations = destinations
        .Where(d => d.Description
            .Contains(search, StringComparison.InvariantCultureIgnoreCase));

    foreach (var destination in insertDestinations)
    {
        Rate rate = new Rate()
        {
            DestinationGroup = rateSheetSummary.Description,
            Description = rateSheetSummary.Description,
            DestinationId = destination.DestinationId,
            Country = rateSheetSummary.CountryCode,
            TariffId = tariffId,
            NextPrice = rateSheetSummary.Peak,
            Price = rateSheetSummary.Peak,
            Interval = 1,
            Discontinued = "N",
            Forbidden = "N"
        };
        rates.Add(rate);
    }
}
```
At the end of the day though, this might only provide a minor improvement to performance. The main issue is structural: every one of the 12,000 destinations is still scanned for every rate sheet summary.
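If, after those changes, profiling still shows the nested scan dominating, one option is to spread the outer loop across cores with PLINQ. This is a sketch rather than a drop-in replacement: it assumes constructing a `Rate` has no side effects, that `destinations` is not modified while the query runs, and that the order of `rates` doesn't matter.

```csharp
var newRates = rateSheetSummaries
    .AsParallel()
    .SelectMany(rateSheetSummary =>
    {
        // Same normalisation as before, done once per summary.
        string search = rateSheetSummary.DestinationLookup.Trim();
        return destinations
            .Where(d => d.Description
                .Contains(search, StringComparison.InvariantCultureIgnoreCase))
            .Select(destination => new Rate()
            {
                DestinationGroup = rateSheetSummary.Description,
                Description = rateSheetSummary.Description,
                DestinationId = destination.DestinationId,
                Country = rateSheetSummary.CountryCode,
                TariffId = tariffId,
                NextPrice = rateSheetSummary.Peak,
                Price = rateSheetSummary.Peak,
                Interval = 1,
                Discontinued = "N",
                Forbidden = "N"
            });
    })
    .ToList(); // materialise on the calling thread, then add in one go

rates.AddRange(newRates);
```

Whether that pays off depends on the actual sizes and your hardware, so measure before and after.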