"The fastest code is the code that is never executed."
If something you originally thought would be easy has ended up as many more lines of code than you intended, it is quite possible that your code is getting a little bloated, and more than a little slow. Hoare's law tells us "inside every large program is a small program struggling to get out", so you should consider chopping out blocks of code that are outdated, outmoded, replaced, or irrelevant.
I've quoted from it already, but Eric Raymond's "The Art of Unix Programming" is just so damn right about this sort of thing that I can't help but quote from it again. Raymond says that, "the most powerful optimisation tool in existence may be the delete key", and even quotes Ken Thompson (one of the most highly respected Unix hackers in the world) as saying, "One of my most productive days was throwing away 1000 lines of code."
Knowing what to cut is very much down to personal intuition, and is often very hard at first. However, try this kind of thing out to get an idea of your script's performance:
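A minimal sketch of the idea, assuming a POSIX-style shell: `time` reports how long a whole command takes, and `date +%s` can bracket a single suspect section. Here `sleep 1` is just a stand-in for the script or block you actually want to measure.

```shell
#!/bin/sh
# Time a whole command or script run; "time" prints real/user/sys durations.
# ("sleep 1" is a placeholder - substitute your own script here.)
time sleep 1

# Time just one section of a script, using seconds since the epoch:
start=$(date +%s)
sleep 1                        # the suspect block goes here
end=$(date +%s)
echo "Section took $((end - start)) seconds."
```

For sub-second resolution, GNU `date` also accepts `%N` (nanoseconds), but that is not portable to all systems, so whole seconds are the safe baseline.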
This allows you to time the execution of your script, or at least certain parts of it. If you see something running particularly slowly, it may be because your implementation is bad, or because your actual algorithm is faulty.
When dealing with performance benchmarking, however, you need to be wary - as Donald Knuth famously warned, "premature optimisation is the root of all evil". That is, there are many optimisations that can make your code run faster or more smoothly, but most of them also make the code harder to read and/or edit.
Find a happy medium - test your scripts now and again, and note where things are running poorly. However, hold off on optimising for as long as possible.