Performance Improvement Tips for ForEach loop in C#?
I need to optimize the foreach loop below. The loop takes more and more time as it collects unique items.
Instead, can FilterItems be converted to a List collection directly? If so, how, so that I can take the unique items from it? The problem arises when there are 500,000 items in FilterItems.
Please suggest ways to optimize the code below:
int i = 0;
List<object> order = new List<object>();
List<object> unique = new List<object>();
// FilterItems is a collection of records. Can it be converted to a List collection
// directly, so that the unique items can be taken from it?
foreach (Record rec in FilterItems)
{
    string text = rec.GetValue("Column Name");
    int position = order.BinarySearch(text);
    if (position < 0)
    {
        order.Insert(-position - 1, text);
        unique.Add(text);
    }
    i++;
}
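For context, here is a minimal self-contained version of the loop above, with a hypothetical Record stub and an in-memory list standing in for the real FilterItems collection (neither type is shown in the question). It illustrates why the loop slows down on large inputs: each order.Insert has to shift every element after the insertion point, so the cost grows roughly quadratically with the number of distinct values.

using System;
using System.Collections.Generic;

// Hypothetical stand-in for the real record type; only GetValue is assumed.
class Record
{
    private readonly string value;
    public Record(string value) { this.value = value; }
    public string GetValue(string columnName) => value;
}

class Program
{
    static void Main()
    {
        // Hypothetical stand-in for FilterItems: 500,000 records with many duplicates.
        var filterItems = new List<Record>();
        var rng = new Random(42);
        for (int n = 0; n < 500_000; n++)
            filterItems.Add(new Record("value" + rng.Next(0, 100_000)));

        List<object> order = new List<object>();
        List<object> unique = new List<object>();
        foreach (Record rec in filterItems)
        {
            string text = rec.GetValue("Column Name");
            int position = order.BinarySearch(text);
            if (position < 0)
            {
                // Insert is O(n): every later element is shifted, which dominates the runtime.
                order.Insert(-position - 1, text);
                unique.Add(text);
            }
        }
        Console.WriteLine(unique.Count);
    }
}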
It's unclear what you mean by "converting FilterItems to a list" when we don't know anything about its type, but you could consider sorting after you've got the items, rather than as you go:
var strings = FilterItems.Select(record => record.GetValue("Column Name"))
                         .Distinct()
                         .OrderBy(x => x)
                         .ToList();
The use of Distinct() here is to avoid sorting lots of equal items - it looks like you only want the distinct items anyway.
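Distinct() relies on hashing rather than on a sorted list, which is what makes it cheap on large inputs. The exact mechanics are an implementation detail, but a rough sketch of an equivalent hash-based deduplication looks like this (DistinctValues is a hypothetical helper written for illustration, not part of LINQ):

using System.Collections.Generic;

static class EnumerableSketch
{
    // Hypothetical helper: yields each value the first time it is seen.
    // HashSet<T>.Add is roughly O(1) per item, so the whole pass is roughly O(n),
    // unlike the repeated List.Insert in the original loop.
    public static IEnumerable<T> DistinctValues<T>(this IEnumerable<T> source)
    {
        var seen = new HashSet<T>();
        foreach (T item in source)
        {
            if (seen.Add(item))   // Add returns false for duplicates
                yield return item;
        }
    }
}

With a set doing the duplicate check, there is no per-item insertion into a sorted list, and the single OrderBy at the end only has to sort the distinct values.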
If you want unique to be in the original order and order to be the same items but sorted, use:
var unique = FilterItems.Select(record => record.GetValue("Column Name"))
                        .Distinct()
                        .ToList();
var order = unique.OrderBy(x => x).ToList();
Now, Distinct() isn't guaranteed to preserve order - but it does in the current implementation, and that's the most natural implementation, too.
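If you'd rather not rely on that implementation detail, here is a small sketch that guarantees first-seen order by doing the duplicate check explicitly with a HashSet (assuming the same FilterItems and GetValue as in the question, and System.Linq in scope for OrderBy):

// Order-preserving deduplication without relying on Distinct()'s ordering behaviour.
var seen = new HashSet<string>();
var unique = new List<string>();
foreach (var record in FilterItems)
{
    string text = record.GetValue("Column Name");
    if (seen.Add(text))      // true only the first time a value is seen
        unique.Add(text);
}
var order = unique.OrderBy(x => x).ToList();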