
RossGayler (@RossGayler@aus.social)

@cian @cogsci
Just off the top of my head (speculation alert)

"1) assume we have a limited storage capacity so aren't good at raw memorisation"
Maybe everything gets encoded and stored, but we aren't so good at retrieval/recall of specific episodes.
Maybe that poor exact episodic retrieval is a consequence of generalisation at retrieval.

"2) even after learning something, we tend to forget details over time."
Assuming we keep storing new episodic memories over time, the accumulation of new memories might make it harder to retrieve specific old episodes via a generalisation-at-retrieval mechanism. (Also, even if parts of old episodes randomly disappeared over time, that wouldn't necessarily break generalisation at retrieval: a good generalisation mechanism should be able to cope with partial records of episodes.)
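The "generalisation at retrieval" idea above can be sketched concretely. This is a toy illustration, not a model from the thread; all names and episode data are hypothetical. Episodes are stored verbatim (possibly with missing features), and a partial cue retrieves a per-feature blend of the most similar stored episodes, so generalisation happens at retrieval time, not at storage time:

```python
# Toy sketch of generalisation at retrieval. Episodes are feature lists;
# None marks a missing feature (a partial record or a partial cue).

def similarity(cue, episode):
    # Compare only features present in both the cue and the episode,
    # so partial records on either side still contribute.
    shared = [(c, e) for c, e in zip(cue, episode) if c is not None and e is not None]
    if not shared:
        return 0.0
    return sum(1.0 for c, e in shared if c == e) / len(shared)

def retrieve(cue, store, k=2):
    # Blend the k most similar stored episodes via a per-feature
    # majority vote: the output is a generalisation, not one episode.
    ranked = sorted(store, key=lambda ep: similarity(cue, ep), reverse=True)
    top = ranked[:k]
    result = []
    for feats in zip(*top):
        present = [f for f in feats if f is not None]
        result.append(max(set(present), key=present.count) if present else None)
    return result

store = [
    ["cafe", "rain", "umbrella"],
    ["cafe", "rain", "coat"],
    ["park", "sun", None],  # partial record: one feature lost
]
print(retrieve(["cafe", "rain", None], store))
```

Note that deleting a feature from a stored episode degrades its similarity score gracefully rather than making it unretrievable, which is the point about partial records.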

"Neither of these really apply to ANNs?"
Well, you could include weight decay in an ANN, and there is the phenomenon of catastrophic forgetting. However, I take the relevance of most current ANNs (feedforward, weight-optimising networks) to cognitive concerns with a fairly large pinch of salt.
IMO the theoretical conceptual framework of most current ANNs doesn't make contact with cognitive concerns, so you can't really ask these questions of them.
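The weight-decay point can be made concrete with a toy update rule (a minimal sketch, not any particular ANN library or model from the thread): once the task gradient is gone, the decay term alone shrinks a stored weight geometrically toward zero, a crude analogue of details fading over time.

```python
# Toy illustration of weight decay as a forgetting mechanism.
# A single "memory" weight is assumed to have been trained to 1.0;
# updates then continue with only the weight-decay term active.

def decay_step(w, decay=0.1):
    """One update with only the decay term: w <- w - decay * w."""
    return w * (1.0 - decay)

w = 1.0  # weight after "learning" some association
history = [w]
for _ in range(20):
    w = decay_step(w)
    history.append(w)

# The stored value shrinks geometrically (1.0 -> 0.9^20, about 0.12),
# so the encoded information fades even though nothing was overwritten.
print(history[0], history[-1])
```

Catastrophic forgetting is the complementary failure mode: there the weight does not fade passively but is actively overwritten by gradients from a new task.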
