I was recently puzzled by a cookie problem that could only be reproduced on IE and could not be seen on Firefox or Chrome. The cookies generated by a testing server were set on Firefox and Chrome but were rejected by IE, so a user could not access any protected resource right after logging in. After struggling with both the server and the client for about three hours, I read the messages line by line with wide-open eyes and found that a fresh cookie was already expired by the time it was sent to the client. The root cause was that the test server's system time was about 24 hours behind.
The browsers I tested with were all the latest stable versions, so I assume this reflects how these three browsers check whether a cookie is expired. Since the expiration date/time in the Set-Cookie header is indeed in the past by the time the client sees it in the response, IE's approach seems valid at first sight. It is also straightforward to implement.
We all know that any system's clock is inaccurate, and that the clocks of a server and a client are not synchronized in most cases. If the clocks are not too far off the "true" time, the cookie's age is not too short, and the cookie delivery latency is not too long, judging expiration by the Expires value alone works fine. However, what if these assumptions do not hold?
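Under those assumptions the naive check is just a comparison against the local clock. Here is a minimal sketch in Python of that idea; the function name and the 24-hour skew are illustrative, not any browser's actual code:

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def is_expired_naive(expires_header: str, now: datetime) -> bool:
    """Judge expiration by comparing Expires against the local clock only."""
    return parsedate_to_datetime(expires_header) <= now

# A cookie meant to live one hour, issued by a server whose clock is
# 24 hours behind: its Expires date is already in the client's past.
server_now = datetime.now(timezone.utc) - timedelta(hours=24)
header = (server_now + timedelta(hours=1)).strftime("%a, %d %b %Y %H:%M:%S GMT")
print(is_expired_naive(header, datetime.now(timezone.utc)))  # True: rejected on arrival
```

With an accurate server clock the same check behaves correctly, which is why the problem stayed hidden until the clock drifted far enough.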
In such cases, Firefox and Chrome use an extra piece of information to decide whether a cookie is expired: the response's Date header. With both Date and Expires, the client can compute how long the cookie remains valid relative to its own local "now". I think this is the more realistic way to handle the problem in a distributed environment.
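The Date-relative approach can be sketched the same way: subtract Date from Expires to get a lifetime, then apply that lifetime to the local clock. Again, this is an illustrative sketch under the same assumed 24-hour skew, not Firefox's or Chrome's actual implementation:

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def local_expiry(date_header: str, expires_header: str, now: datetime) -> datetime:
    """Interpret Expires relative to the server's Date header, not the local clock."""
    lifetime = parsedate_to_datetime(expires_header) - parsedate_to_datetime(date_header)
    return now + lifetime

# Same skewed server as before: Date and Expires are both 24 hours in
# the past, but their difference (one hour) survives the skew.
fmt = "%a, %d %b %Y %H:%M:%S GMT"
server_now = datetime.now(timezone.utc) - timedelta(hours=24)
date_hdr = server_now.strftime(fmt)
expires_hdr = (server_now + timedelta(hours=1)).strftime(fmt)
now = datetime.now(timezone.utc)
print(local_expiry(date_hdr, expires_hdr, now) > now)  # True: the cookie is still fresh
```

The subtraction cancels out whatever offset the server's clock has, since Date and Expires are both stamped by the same clock; only the delivery latency between stamping and receipt remains as error.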