In a recent PyTorch exercise, I used the torch.nn.Parameter() class to create a module parameter, but found that the parameter was filled with tiny values like 1.4013e-45, which led to very strange results. When I later replaced torch.nn.Parameter() with torch.nn.Linear(), I was surprised to find that the initial values were no longer odd and the results were reasonable.

I want to find an explanation for this. What exactly do nn.Parameter() and nn.Linear() do when they are called? How do they differ from each other, and from nn.Embedding(), another frequently used parameter-building module?


weight = torch.nn.Parameter(torch.FloatTensor(2, 2))

The code above shows an example of how to use nn.Parameter(). Note that torch.FloatTensor(2, 2) allocates uninitialized memory and nn.Parameter() only registers the tensor as a learnable parameter; neither initializes the values, which is where the tiny garbage values come from.
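A minimal sketch of the difference (the explicit Xavier initialization below is one common fix among several, not the only option):

```python
import torch
import torch.nn as nn

# torch.FloatTensor(2, 2) allocates memory WITHOUT initializing it, so the
# parameter holds whatever bytes happened to be there -- often denormal
# values like 1.4013e-45. nn.Parameter() only registers the tensor as a
# learnable parameter of the module; it does not initialize it.
uninit = nn.Parameter(torch.FloatTensor(2, 2))

# One common fix: allocate, then initialize explicitly, e.g. with
# Xavier/Glorot uniform initialization.
weight = nn.Parameter(torch.empty(2, 2))
nn.init.xavier_uniform_(weight)

# nn.Linear, by contrast, initializes its weight and bias itself in
# reset_parameters(), so its values are sensible out of the box.
linear = nn.Linear(2, 2)

print(weight)
print(linear.weight)
```

This is why swapping in nn.Linear() made the results reasonable: the module carries its own initialization, while a bare nn.Parameter() leaves that responsibility to you.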

— — —

Temporal Knowledge Graph

A temporal knowledge graph is a special type of knowledge graph. Usually, knowledge graph data is presented as triples, denoted (h, r, t), where h, r, and t stand for head entity, relation, and tail entity respectively. A temporal knowledge graph additionally records the temporal attribute of each relation, so its data is presented as quadruples, denoted (h, r, t, a_t), where a_t is the temporal attribute of the relation. For instance, (Trump, is_president_of, US, 2017–2020).
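The quadruple form above can be sketched in plain Python (the field names and the (start, end) encoding of the temporal attribute are illustrative choices, not a fixed convention):

```python
from collections import namedtuple

# A quadruple (h, r, t, a_t): head entity, relation, tail entity, and the
# temporal attribute of the relation, encoded here as a (start, end) interval.
Quadruple = namedtuple("Quadruple", ["head", "relation", "tail", "time"])

fact = Quadruple("Trump", "is_president_of", "US", (2017, 2020))

# An ordinary (non-temporal) triple simply drops the temporal attribute:
triple = (fact.head, fact.relation, fact.tail)
```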

Representation learning…


I study knowledge graph representation learning and love to share :)
