Older results will still show up, but these search engines are no longer able to “crawl” Reddit, meaning that Google is the only search engine that will turn up results from Reddit going forward.
Robots.txt lets you ask specific user-agents (crawlers) not to fetch pages from the site. My guess is that that’s how they restricted it. I don’t know how those changes affect already-indexed pages – I don’t think there’s any standard for that – but it’ll stop compliant crawlers from downloading new ones.
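For reference, per-user-agent rules in robots.txt look something like this (a made-up fragment, not the site’s actual file – the bot names are just examples):

```
User-agent: Bingbot
Disallow: /

User-agent: *
Disallow:
```

A compliant crawler fetches /robots.txt first and skips any path matching a Disallow rule for its user-agent; an empty Disallow means “nothing blocked.”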
Try searching for new posts and see how DDG/Bing compare to Google.
EDIT: Yeah. They’ve got a sitewide ban for all crawlers. That’d normally block Google’s bot too, but I bet they have some offline agreement to have it ignore the file and operate out-of-spec.
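You can check what a sitewide ban means for a spec-compliant crawler with Python’s stdlib robots.txt parser. This is a sketch: the robots.txt text mirrors a blanket `User-agent: * / Disallow: /` ban, and `example.com` plus the post path are placeholders.

```python
from urllib.robotparser import RobotFileParser

# A sitewide ban like the one described: every user-agent is
# disallowed from every path.
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A spec-compliant crawler consults this before fetching any page.
# Note that Googlebot is blocked too -- honoring the ban would require
# the bot to actually follow the file.
for bot in ("Bingbot", "DuckDuckBot", "Googlebot"):
    print(bot, rp.can_fetch(bot, "https://example.com/r/some_post"))
```

All three print `False`, which is the point: a blanket ban doesn’t carve out Google, so any exception has to happen out-of-band.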