Corpsey

bandz ahoy
 

Benny Bunter

Well-known member
they're very different. it's a misdiagnosis.
I think there definitely are Prynne fans like that, which is why I quoted Douglas Oliver, whose take I think is very astute. Both extremes get it wrong. Oliver gets what he's doing.

"He is a difficult, often rather obscure poet at a time when it is not very modish to be obscure; once understood, his work can prove rewarding. If one examines the ephemera of newspaper and journal reviews, one finds that his poetry has sometimes been dismissed by reviewers who think that confessing their own lack of understanding permits the arrogance of blind attack. But it has also sometimes been stoutly defended perhaps by those who, understanding perhaps fitfully, have made his difficulty into a virtue; the difficulties in Prynne mostly result from a deliberate choice of writing method in which the emphasis is upon the act of writing itself."
 

luka

Well-known member
Dendrites are appendages that are designed to receive communications from other cells. They resemble a tree-like structure, forming projections that become stimulated by other neurons and conduct the electrochemical charge to the cell body (or, more rarely, directly to the axons).
 

luka

Well-known member
BPNet, or branch-pruned network, is also a GA-based NAS technique [19] for the conditional CNNs, which finds out the optimal locations and thresholds of auxiliary classifiers in the given base CNN for the target platform. It consists of profiling and pruning phases, in addition to training and testing phases. In the training of the BPNet, the candidate auxiliary classifiers, or branches, are attached at conceptually every possible location (i.e., after every convolution layer), and then, in the profiling phase, the softmax value and the latency at each auxiliary classifier are measured and stored in the LUT for the calibration dataset. This LUT is used in the branch pruning phase, where design space for auxiliary classifiers is explored using GA: the location and threshold values of each branch are encoded as a gene and their latency and accuracy scores are evaluated by referencing the LUT. As a result of the branch pruning, only helpful branches, or classifiers, survive and lead to latency reduction while maintaining the accuracy.
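To ground the jargon a bit, here's a minimal sketch of how a branch "gene" might be scored against a profiled LUT during the pruning phase. The LUT layout, gene encoding, and all numbers are illustrative assumptions, not BPNet's actual data structures.

```python
# lut[image_id][branch_id] = (softmax confidence, cumulative latency in ms),
# as profiled on the calibration dataset.
lut = {
    0: {0: (0.95, 1.0), 1: (0.99, 2.0), 2: (1.00, 3.0)},
    1: {0: (0.40, 1.0), 1: (0.80, 2.0), 2: (1.00, 3.0)},
}

def evaluate_gene(gene, lut):
    """gene: list of (branch_id, threshold) pairs, i.e. surviving branches.

    Returns the mean latency over the calibration images, assuming an
    input exits at the first branch whose confidence meets its threshold.
    """
    total = 0.0
    for branches in lut.values():
        for branch_id, threshold in sorted(gene):
            conf, latency = branches[branch_id]
            if conf >= threshold:
                total += latency
                break
    return total / len(lut)

# Image 0 exits at branch 0 (0.95 >= 0.9); image 1 falls through to branch 2.
print(evaluate_gene([(0, 0.9), (2, 0.0)], lut))  # -> 2.0
```

The GA's fitness function would combine a score like this with the accuracy entries in the same LUT.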
 

luka

Well-known member

.5. Latency Prediction of Conditional CNN

A CNN latency is dependent on the target platform and a conditional CNN latency is further dependent on the input image because it allows early exit at the auxiliary classifier. OFA and BPNet both use LUT-based latency estimation. In OFA, the execution time of different types of layers with various shapes is profiled in advance on the target platform and stored in an LUT, which is then referenced by the latency estimator in the GA to estimate the latency of an arbitrary CNN. Because a CNN layer latency is almost the same regardless of input images, the latency estimator references the LUT and simply adds the latency of each layer in the CNN. In BPNet, the base CNN is fixed and the latency as well as the softmax value at each branch (i.e., auxiliary classifier) in the conditional CNN is profiled and stored in an LUT in advance; then, it is looked up in the GA (i.e., branch pruning stage). However, CondNAS cannot construct the latency LUT of the conditional CNNs in advance as the base CNN is not fixed but is arbitrary, unlike BPNet. Similarly, OFA’s approach of adding the layer latency to estimate the entire network latency is not efficient because, unlike OFA, CondNAS should estimate the conditional CNN latency for all input images as it varies depending on input images. Thus, it must run all input images in the calibration dataset to obtain the mean latency of the conditional CNNs, whereas OFA’s latency estimator for (base) CNNs need only one calculation to estimate the latency as it does not vary depending on the input image. Therefore, we also propose to predict the latency of conditional CNNs with the LightGBM model. Similarly to the accuracy prediction model, the latency prediction model is trained with a number of diverse conditional CNNs with latency labels. Then, it can efficiently predict the mean latency of a given conditional CNN without running any input images during GA. 
Generating the latency training dataset requires a similar order of time as generating the accuracy dataset. Unlike the accuracy training dataset, however, the latency training dataset cannot be reused when the target platform is changed.
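The OFA-style LUT estimation described above is simple enough to sketch in a few lines: per-layer latencies are profiled once, and an arbitrary network's latency is just the sum of its layers' LUT entries. The layer keys and timings here are made-up values for illustration.

```python
# (layer type, channel width) -> profiled latency in ms on the target platform
layer_lut = {
    ("conv3x3", 32): 0.5,
    ("conv3x3", 64): 1.2,
    ("fc", 10): 0.1,
}

def estimate_latency(layers, lut):
    """One lookup-and-add per layer; no input images are needed, because
    a plain CNN layer's latency barely varies with the input."""
    return sum(lut[layer] for layer in layers)

net = [("conv3x3", 32), ("conv3x3", 64), ("fc", 10)]
print(round(estimate_latency(net, layer_lut), 3))  # -> 1.8
```

The point of the passage is that this one-shot summation breaks down for conditional CNNs, whose latency depends on where each input exits — hence the learned (LightGBM) predictor instead.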
 

luka

Well-known member
Convolutional Neural Networks (CNN) are mainly used for image recognition. The fact that the input is assumed to be an image enables an architecture to be created such that certain properties can be encoded into the architecture and reduces the number of parameters required.

The convolution operator is basically a filter that enables complex operations to be performed on an image. Examples are edge detection, gradient recognition and smoothing. This allows pertinent data to be extracted from the image.
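To make the filter idea concrete, here's a bare-bones 2D convolution in plain Python (valid mode only: no padding, no strides) with a simple vertical-edge kernel. The image and kernel are toy values.

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image and take the elementwise
    product-sum at each position (valid mode)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# A vertical-edge kernel on an image with a hard left/right intensity split:
image = [
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
]
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
print(convolve2d(image, kernel))  # -> [[30, 30]]
```

The output responds strongly where the intensity jumps, which is exactly the "pertinent data" (here, an edge) being extracted.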
 

luka

Well-known member
versus saline via log-rank (Mantel–Cox) test (saline n = 17, EC18 n = 10). (d) Effect of EC18 (10 mg·kg−1, i.p.) on PTZ s.c. induced hindlimb extension seizure latency of the conditional HCN4 knockout (cHCN4KO) mouse 10 min post injection. P = 0.670 versus cHCN4KO + saline via log-rank (Mantel–Cox) test (cHCN4KO + saline n = 8, cHCN4KO + EC18 n = 7). (e) Electrocorticography (ECoG)
 

sufi

lala
here's an updated version of the script that removes punctuation along with conjunctions, prepositions, and repeated words, and then formats the remaining words into a column with 4-7 words and roughly similar numbers of characters on each line:
Code:
import string

text = "This is a sample text, and it contains some conjunctions, prepositions, and repeated words such as and, the, and and."

# Remove punctuation from the text
text = text.translate(str.maketrans("", "", string.punctuation))

# Split the text into words
words = text.split()

# Define the list of conjunctions and prepositions to remove
to_remove = ["and", "or", "but", "the", "a", "an", "in", "on", "at", "to", "of", "for", "with", "by", "from"]

# Remove the conjunctions, prepositions, and repeated words
unique_words = []
for word in words:
    if word.lower() not in to_remove and word not in unique_words:
        unique_words.append(word)

# Create a list of lines with 4-7 words each
lines = []
line = ""
for word in unique_words:
    if line and (len(line) + len(word) + 1 > 30 or len(line.split()) >= 7):
        lines.append(line)
        line = ""
    line = (line + " " + word).strip()
if line:
    lines.append(line)

# Print the result
for line in lines:
    print(line)
This script first removes all punctuation from the input text using the translate() method and the string.punctuation constant. Then, it removes conjunctions, prepositions, and repeated words using a loop, as before. Finally, it formats the remaining words into a column with 4-7 words and roughly similar numbers of characters on each line, and prints the resulting lines one by one. Note that this script assumes that words in the input text are separated by spaces, so you may need to modify it if your input text uses a different delimiter.
 

luka

Well-known member
The pipeline that produces the log scaler has a latency of 49 cycles on the Virtex-2 Pro and 43 cycles on the Virtex-6, or 81 cycles and 75 cycles when including the latency of the conditional probability pipeline, from which it receives its inputs.
 

sufi

lala
I haven't been able to run this yet to see how it might perform, but thinking of rules for a Prynne emulator is fun

- This needs stanzas/pages still,
- and also perhaps randomising the word order would be interesting
- maybe make an occasional exception to the deleted conjunctions to spoof some sort of logical order

taking it further, this would be out of range for a simple python script, but could be done on one of these filthy AIs that have some language recognition abilities
- convert verbs to infinitive
- convert nouns to singular
- change word order to verb-adjective-noun or something, maybe
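For what it's worth, the first two rules above (stanza breaks and randomised word order) are easy to bolt onto the script's output. This is a rough sketch, not a tested emulator; the function name, stanza size, and sample words are all made up.

```python
import random

def shuffle_and_stanza(words, lines_per_stanza=4, line_len=30, seed=None):
    """Shuffle the word list, rebuild ~30-char lines as in the script
    above, and insert a blank line every few lines to fake stanzas."""
    rng = random.Random(seed)  # seed makes the shuffle reproducible
    words = words[:]
    rng.shuffle(words)
    lines, line = [], ""
    for word in words:
        if line and len(line) + len(word) + 1 > line_len:
            lines.append(line)
            line = ""
        line = (line + " " + word).strip()
    if line:
        lines.append(line)
    stanzas = []
    for i in range(0, len(lines), lines_per_stanza):
        stanzas.append("\n".join(lines[i:i + lines_per_stanza]))
    return "\n\n".join(stanzas)

print(shuffle_and_stanza(
    "latency conditional auxiliary classifier softmax gene".split(), seed=1))
```

The occasional-conjunction and verb/noun rules would need part-of-speech tagging, which is where the AI bit comes in.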
 

william_kent

Well-known member
I tried your script on your post:

This needs stanzaspages still
also perhaps randomising word
order would be interesting
maybe make occasional
exception deleted
conjunctions spoof some sort
logical taking it further
though this out range simple
python script could done one
these filthy AIs that have
language recognition
abilities convert verbs
infinitive nouns singular
change verbadjectivenoun
something
 

Benny Bunter

Well-known member
Here's another quote from Douglas Oliver that I think sheds a lot of light on what Prynne does (referring to Wound Response specifically, but you could apply it to a lot of his other work)

"Prynne's work would reach, if it could, beyond the language condition where sub-microscopic, bio-chemical events are mere metaphor for mental process to a condition where they become more closely a description of that process...

Linking a description of sub-microscopic events to mental events proposes an inner relation that is matched by an outward relation between human mental process and the external world it perceives, where the same sub-microscopic events determine process...

What Prynne's poetry has often sought to do is look at the sub-atomic, instead of the macrocosmic, and see if close analogies can be drawn between such events and the poetic act of mind.

This could get pretentious, but there is always a risk that poets will give up on the kind of task set for them by the great figures such as Dante. Particularly, the birth of the mind act is one of poetry's primary subject matters...

Providing it keeps its humility, poetry can try to reach beyond analogy towards a kind of visionary-literal."
 

sufi

lala
not that impressive (due to my original poor prose style)
I wanted to try it on one of the word slabs luka posted above... kinda works better:

BPNet branch-pruned network GA-based
NAS technique optimal locations
thresholds auxiliary classifiers given
base CNN target platform consists
profiling pruning phases addition
training testing candidate attached
conceptually convolution layer softmax
value measured stored LUT calibration
dataset design space explored location
threshold values branch encoded gene
latency accuracy scores evaluated
referencing result helpful survive lead
reduction maintaining.
 

sufi

lala
wow, the second, longer one comes out radically shortened

CNN latency target platform conditional
input image early exit auxiliary classifier
OFA BPNet LUT-based estimation execution
time types layers shapes profiled stored
estimator arbitrary look-up CondNAS
construct efficiency mean LightGBM
model diverse labels prediction training
dataset generation reuse changed.
 

version

Well-known member

Similar to this from one of those Gravity's Rainbow articles I posted.

... the inanimate world is made dynamically animate. A light bulb is given a lengthy biography. Pinball machines rewire themselves. A new polymer is sexualised. Fungus Pygmies sing acapella “on the other side of […] the whole bacteria-hydrocarbon-waste cycle”.

At the deepest level, molecules do not seek independence, but ever more complicated bonds and ties. Fusion, not the fission that detonates over Hiroshima, will out. Life insists on its becoming, even under the mushroom clouds.

[...]

The novel also marks a major moment of transition: from an aesthetic theory based on resemblance (analogy, metaphor, symbolism, etc.) to one predicated on the infinite fungibility of molecular matter and the omnipresence of electronic signals.
 

Benny Bunter

Well-known member
The novel also marks a major moment of transition: from an aesthetic theory based on resemblance (analogy, metaphor, symbolism, etc.) to one predicated on the infinite fungibility of molecular matter and the omnipresence of electronic signals.
Yeah, I can see how that last bit relates to it, striving to go beyond metaphor to pure description of the process of the act of the poetic mind.

I think this is what @luka was on about earlier - tracing information pathways and processes, but also performing them in an integrated, poetic way.

I'm not sure you can ever get completely beyond metaphor and symbolism into pure material description tbh. Maybe you can to a high degree with the elements within the poem, but I still think the poem as a whole ends up being figurative.

Oliver also says in the same essay that if you did manage to write something that was pure description of how the mind works you would solve the age-old body/mind question, which is probably unattainable, but he still commends Prynne for striving towards it.

But I'm probably chatting shit about stuff I don't understand now, I know next to nothing about philosophy.
 

Benny Bunter

Well-known member
Paraphrasing Oliver again, Wound Response starts with a bodily/mental wounding and describes the body's response to that wound, from the first biochemical responses, to the brain signals that register as pain, and finally on to the actual physical bruising.

In parallel to the body's response, the poem describes the instant when the wounding occurs, and describes how it registers in the mind/consciousness and is finally committed to memory.
 

Benny Bunter

Well-known member
Haven't properly looked at any of the poems in that new Latency book yet, but what Luka said about them makes me think they're very much along the same lines, although perhaps widening out from just descriptions of body/mind responses to 'information flows' of all kinds.
 