Background/introduction: Pre-trained transformer models excel at many Natural Language Processing tasks and are therefore expected to encode the meaning of an input sentence or text in their representations. These sentence-level embeddings are also important in retrieval-augmented generation. But