Daily Guardian
Press Release

Arllecta Group announces its new S2S algorithm that significantly outperforms current strong GPT-solutions

By News Room | September 23, 2024 | 5 min read

SINGAPORE, Sept. 23, 2024 (GLOBE NEWSWIRE) — Arllecta Group has carried out the initial implementation of its sense-to-sense (S2S) algorithm, built on the company’s own mathematical framework, Sense Theory, which was designed specifically for the creation of self-identifying AI.

A significant advantage of the S2S architecture is that it searches for meaning both in single-page (A4-format) documents and in ultra-large volumes of data in cloud storage.

Now we will briefly describe the technological approach used in our solution.

Compared with today’s most advanced GPT models, the algorithm is distinguished by at least four characteristics.

The first difference is that the algorithm does not require gigabytes of data and billions of normalized parameters for training. The architecture of the sense-algorithm includes, as a special case, an implementation of a restricted Boltzmann machine. At its core, however, lies the implementation of the zero-object paradigm as a vector quantity of non-constant length. This approach allows us to move from numerical (statistical) analysis to sense (semantic) analysis, which is essential when creating self-learning, adaptive AI. Without the ability to create “new” knowledge, no AI will achieve self-identification (self-consciousness).
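For reference, a restricted Boltzmann machine, the standard construct that the sense-algorithm is said to include as a special case, pairs a visible and a hidden layer of binary units and is typically trained with contrastive divergence. The sketch below is a generic NumPy illustration of that standard construct, not Arllecta’s implementation; all names, sizes, and parameters are our own.

```python
import numpy as np

# Generic restricted Boltzmann machine (RBM): a bipartite energy-based
# model with visible and hidden binary units and no intra-layer links.
class RBM:
    def __init__(self, n_visible, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible-unit biases
        self.b_h = np.zeros(n_hidden)    # hidden-unit biases
        self.rng = rng

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        # P(h=1 | v) for each hidden unit
        return self._sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        # P(v=1 | h) for each visible unit
        return self._sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0, lr=0.1):
        # One step of contrastive divergence (CD-1):
        # up-pass, sample hidden units, down-pass, up-pass again.
        h0 = self.hidden_probs(v0)
        h0_sample = (self.rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        n = len(v0)
        self.W += lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (h0 - h1).mean(axis=0)
        return ((v0 - v1) ** 2).mean()   # mean squared reconstruction error

# Toy binary data: two rough clusters of patterns.
rbm = RBM(n_visible=6, n_hidden=3)
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 0, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 1, 1, 1, 0]], dtype=float)
errors = [rbm.cd1_step(data) for _ in range(200)]
```

The reconstruction error returned by each CD-1 step gives a rough (not exact) progress signal for this kind of training.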

The second difference is that for two AI tasks critical to the result, clustering and data typing, we use two innovative tools: the Neuro-Amorphic Function (NAF) and Sense Diagrams. NAF allows us to extrapolate many physical phenomena found in nature and describes very well the distribution of senses (meanings) in the input text. A sense diagram, created by analogy with Bohr’s atomic theory, very accurately describes large groups of sense (semantic) sets built from the input data and, just as importantly, the semantic connections between every element of these sets. It is worth noting that the computational time required to construct and analyze such sets is polynomial.

The third difference is that the sense-algorithm can meaningfully connect objects of different natures drawn from a large volume of data. For example, given 1,000,000 sets classified by any criterion, whose elements have different origins, the sense-algorithm can find a semantic connection between any two objects of these sets.

The fourth, and perhaps most important, difference is that we do not use the transformer architecture at all, since we believe it has several critical defects that greatly distort the meaning of the processed input text. Two obvious defects are, first, that the combination law for the scalar product of a query vector and a key vector cannot be satisfied in the attention mechanism, and second, that normalizing the values of the attention mechanism’s output vector introduces distortions in the decoder module, since any type of context vector, when normalized, loses the main thing: the semantic connection between its elements.
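For context, the two transformer operations criticized above, the query-key scalar product and the subsequent normalization of the attention weights, both appear in the standard scaled dot-product attention formula softmax(QKᵀ/√d_k)·V. The following is a minimal generic NumPy sketch of that standard mechanism, not of the S2S algorithm; all variable names and dimensions are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard transformer attention: softmax(Q K^T / sqrt(d_k)) V.

    Both operations the release objects to are visible here:
    the query-key scalar products and the softmax normalization
    applied to the resulting attention weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key scalar products
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax normalization
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, model dim 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))   # one value vector per key
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of the weight matrix sums to 1 after the softmax, which is exactly the normalization step the release argues discards semantic relationships between context-vector elements.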

Now we will briefly describe the AI focus in our solution.

According to Egger Mielberg, the author of the mathematical theory Sense Theory and the creator of the 25 main software modules that make up the software core of S2Schat, the current task of our AI is to return just one sentence as its answer when analyzing not only a short passage from a book or a specialized article, but also a full-length book of 500 pages or more.

The main cognitive task of our solution is to implement two important directions. The first direction is to search for and linguistically describe the zero object of the first level as a vector axis that determines the main meaning of the processed text.

The second direction is to search for and describe the depth of the semantic connection between zero objects of the second and subsequent levels. For this implementation, we use Sense Derivatives and the sense entropy value.

The uniqueness of the meaning of sense entropy lies in the possibility of describing the degree of semantic connections between objects of different natures. This feature is missing in traditional mathematics.

Now we will present some of the figures obtained when comparing our S2Schat solution (13 of its 25 modules implemented) with the most advanced GPT solution currently on the market.

[Image: GPT-4 vs. S2Schat comparison, chart 1]

The numerical values indicated in this article are approximate and may contain inaccuracies due to technical and other reasons.

For our text analysis we used the American Ways book (XX, On Understanding excerpt).

Below are the exact specifications of our solution alongside approximate specifications for GPT.

[Image: GPT vs. S2Schat comparison, chart 2]

In our algorithmic calculations, we consider “sense energy,” which has a vector nature; it is therefore more accurate to state the Law of Conservation of Sense as follows:

The total sense energy (SE) of any open sense space (OSS) is constant if the conditions of the Mielberg cycle are satisfied in that sense space.

In other words, to create AI with self-identification, it is extremely important that the implemented algorithm preserve the meaning (sense) of the analyzed book, article, abstract, or other source.

The law of conservation of sense, like the law of conservation of energy, shows the constancy of the existence of the object of study only in different semantic (energy) forms.

The law of conservation of sense, in contrast to the Turing test, qualitatively determines the degree of “humanity” of digital AI.

Resources:

Sense Theory. Part 1.

https://vixra.org/pdf/1905.0105v1.pdf

S2Schat.

https://www.s2schat.com

Sense Derivative.

https://www.researchgate.net/publication/344876659_Sense_Derivative

Sense Entropy. The Law of Conservation of Sense.

https://www.researchgate.net/publication/369295558_Sense_Entropy_The_Law_of_Conservation_of_Sense

Media Contact:

Contact person: Egger Mielberg

Email: [email protected]

Company website: https://www.arllecta.com/

Full company name: Arllecta Pte. Ltd.

Disclaimer: This content is provided by the sponsor. The statements, views, and opinions expressed in this column are solely those of the content provider. The information shared in this press release is not a solicitation for investment, nor is it intended as investment, financial, or trading advice. It is strongly recommended that you conduct thorough research and consult with a professional financial advisor before making any investment or trading decisions. Invest at your own risk.

Photos accompanying this announcement are available at
https://www.globenewswire.com/NewsRoom/AttachmentNg/4019884f-c25b-42ed-bf17-8cacd594b4c9
https://www.globenewswire.com/NewsRoom/AttachmentNg/95acb779-602d-4085-8518-c994055744e7
https://www.globenewswire.com/NewsRoom/AttachmentNg/0cb2aa10-4a0e-472e-9c6f-205a3cb685cb
