When you're working with regular languages specified in regular expression form, there's a really cool idea that you can use for building regular expression matchers, and for describing how to convert from a regular expression to an NFA. It's called the Brzozowski derivative of a regular expression - or just simply the derivative of a regexp.
The basic idea of the derivative is that given a regular expression, \(r\), you can derive a new regular expression with respect to a symbol \(c\), written \(D_c(r)\). \(D_c(r)\) is a regular expression describing the strings matched by \(r\) after it's matched the symbol \(c\).
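To make that concrete, here's a minimal sketch of a derivative-based matcher in Python. The AST class names (`Empty`, `Eps`, `Sym`, `Alt`, `Seq`, `Star`) and the helper names `nullable`, `deriv`, and `matches` are my own choices for illustration; the technique itself is the standard Brzozowski construction: keep differentiating the regex by each input symbol, and accept if the final regex can match the empty string.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Empty:      # matches no strings at all
    pass

@dataclass(frozen=True)
class Eps:        # matches only the empty string
    pass

@dataclass(frozen=True)
class Sym:        # matches a single symbol
    c: str

@dataclass(frozen=True)
class Alt:        # r1 | r2
    l: object
    r: object

@dataclass(frozen=True)
class Seq:        # r1 r2
    l: object
    r: object

@dataclass(frozen=True)
class Star:       # r*
    r: object

def nullable(r):
    """True iff r matches the empty string."""
    if isinstance(r, (Eps, Star)):
        return True
    if isinstance(r, Alt):
        return nullable(r.l) or nullable(r.r)
    if isinstance(r, Seq):
        return nullable(r.l) and nullable(r.r)
    return False  # Empty and Sym never match the empty string

def deriv(r, c):
    """The Brzozowski derivative D_c(r)."""
    if isinstance(r, (Empty, Eps)):
        return Empty()
    if isinstance(r, Sym):
        return Eps() if r.c == c else Empty()
    if isinstance(r, Alt):
        return Alt(deriv(r.l, c), deriv(r.r, c))
    if isinstance(r, Seq):
        d = Seq(deriv(r.l, c), r.r)
        # If the left part can match empty, c might also start a match
        # of the right part.
        return Alt(d, deriv(r.r, c)) if nullable(r.l) else d
    if isinstance(r, Star):
        # Take one step into the body, then the star continues.
        return Seq(deriv(r.r, c), r)

def matches(r, s):
    """Match s against r by deriving one symbol at a time."""
    for c in s:
        r = deriv(r, c)
    return nullable(r)
```

For example, with `r = Star(Seq(Sym('a'), Sym('b')))` - the regex `(ab)*` - `matches(r, "abab")` is true while `matches(r, "aba")` is false. Note this sketch doesn't simplify the intermediate regexes, so they grow as you derive; a real matcher would normalize them (e.g. `Alt(Empty(), r)` to `r`) to keep them small.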
I just recently realized that I only wrote about computability back in the earliest days of this blog. Those posts have never been re-run, and they only exist back on the original blogger site. When I wrote them, I was very new to blogging - looking back, I think I can do a much better job now. So I'm going to re-do that topic. This isn't just going to be a re-post of those early articles, but a complete rewrite.
The way that I'm going to cover this is loosely based on the way that it was first taught to me by a wonderful professor, Dr. Eric Allender at Rutgers, where I went to college. Dr. Allender was a really tremendous professor: he managed to take an area of computer science that could seem hopelessly abstract and abstruse, and turned it into something fun and even exciting to learn about.
Computability is the most basic and fundamental sub-field of theoretical computer science. It's the study of what a mechanical computing device can do. Not just what a specific mechanical computing device can do, but what any mechanical computing device can possibly do. What are the limits of what you can do mechanically? And once we know the limits, what can we discover about the nature of computation?