Trend Engine: AI-Powered News & Trends

Where AI Meets What's Trending

Hi all. A small question regarding encoding the position of inputs to a transformer model.

July 24, 2025

How would you encode a set of sequences for a (bidirectional) transformer? For a single sequence we have positional encodings. For a set of atomic elements we can simply work without them. But what about a set of sequences {s_1, …, s_n}, where each s_i is itself a sequence, yet the relative order of the sequences does not matter?
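
To make the setting concrete, here is a minimal sketch (assuming PyTorch; the function names and the restart-per-sequence scheme are illustrative assumptions, not an established recipe from this post): apply standard sinusoidal positional encodings within each sequence, restarting at every sequence boundary, and add nothing that encodes which sequence comes "first", so the concatenated input is invariant to permuting the sequences.

```python
import torch

def sinusoidal_pe(length: int, d_model: int) -> torch.Tensor:
    """Standard sinusoidal positional encodings (Vaswani et al., 2017)."""
    pos = torch.arange(length, dtype=torch.float32).unsqueeze(1)   # (length, 1)
    dim = torch.arange(0, d_model, 2, dtype=torch.float32)         # (d_model/2,)
    angles = pos / torch.pow(10000.0, dim / d_model)               # (length, d_model/2)
    pe = torch.zeros(length, d_model)
    pe[:, 0::2] = torch.sin(angles)
    pe[:, 1::2] = torch.cos(angles)
    return pe

def encode_sequence_set(sequences: list[torch.Tensor], d_model: int) -> torch.Tensor:
    """Embed a set of sequences {s_1, ..., s_n} for a bidirectional encoder.

    Positions restart at each sequence boundary, and no embedding of the
    sequence index is added, so permuting the sequences merely permutes
    blocks of tokens without changing any token's representation.

    `sequences`: list of (len_i, d_model) tensors of token embeddings.
    Returns a (sum(len_i), d_model) tensor to feed to the encoder.
    """
    encoded = [s + sinusoidal_pe(s.shape[0], d_model) for s in sequences]
    return torch.cat(encoded, dim=0)

# Hypothetical usage: three sequences of different lengths, d_model = 16.
seqs = [torch.randn(5, 16), torch.randn(3, 16), torch.randn(7, 16)]
tokens = encode_sequence_set(seqs, 16)  # order of `seqs` does not matter
```

One caveat with this sketch: tokens carry no sequence identity at all, so the model cannot tell which tokens belong to the same sequence. A possible refinement (again an assumption, not something stated above) is to add a per-sequence ID embedding whose assignment to sequences is randomly permuted on each forward pass, which lets the model group tokens by sequence without ever fixing an order on the set.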
