Yuval Pinter @🍺🍫 #emnlp2018
@yuvalpi
Atlanta, GA
PhD student @gtcomputing doing #NLProc. Formerly @yahooresearch. 🇮🇱. Twice a father, twice ran a marathon, twice made tiramisu from scratch.
24,021 Tweets | 270 Following | 725 Followers

Tweets

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 35 sec
😪

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 2 min
Don't get started, I'm still recovering from the Lebowskiade and the Hamishiade

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 8 min
*Sod off, Baldrick

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 13 min
A secondary sidekick in an immortal British series

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 26 min
(By the way, I have no idea what Shmuel Meir, the putz who somehow blocked me after I'd already had him muted for a long time, wrote in the thread. But I'm definitely right here)

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 29 min
No way, both of them are big.

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 31 min
Tzip adripa yampamponi

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 31 min
Oh hello there, goalposts! What are you doing here? Was the move hard?

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 41 min
On a single seat?

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 47 min

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Q: what about long-range dependencies, semantic tasks etc.?
Goldberg: "I don't think they can do it yet" [from a formal, provable perspective I guess? @yoavgo correct me if this wasn't what you meant]
#ThusEndeth
#BlackboxNLP #emnlp2018

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Tweeter's report, first large session: wifi isn't perfect but holding up bravely, considering. Big challenge coming up tomorrow in the main conference's first keynote #emnlp2018

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
not gonna argue with you!

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Q (x2 actually): how does attention / transformer arch factor into this analysis?
Goldberg: we haven't tried it - great idea for future work
#BlackboxNLP #emnlp2018

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Q: humans also find certain context-free tasks harder than CS ones, is this comparable?
@yoavgo : I think humans process tasks differently, esp. with respect to "copy" behavior vs. "generalize" behavior.
#BlackboxNLP #emnlp2018

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Q (Ray Mooney): how can we use these findings to actually do better NLP?
Goldberg: knowing what our models are good at can help us design them for the right tasks; reasoning about simple rules within RNNs helps set expectations
#BlackboxNLP #emnlp2018

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Goldberg: on the other hand, neither SRNN nor GRU can count.
Empirically, LSTM learns a^nb^n much better than GRU.
(time to tag first author @gail_w )
#BlackboxNLP #emnlp2018 pic.twitter.com/PmWIlgnTYq

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Goldberg: LSTM can reduce to counter machines if the c_t update phase saturates both the forget gate and the input gate (both = 1)
IRNN can also produce that behavior.
#BlackboxNLP #emnlp2018 pic.twitter.com/jDHGjOqQJN
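[The saturation argument in the tweet above can be sketched numerically: a hypothetical scalar LSTM cell whose forget and input gates are both pinned at 1 turns its cell state into a running counter. The ±1 candidate values for 'a'/'b' are an illustrative assumption, not from the talk.]

```python
def saturated_lstm_count(s):
    """Toy scalar LSTM cell in the saturated regime: with forget gate
    f = 1 and input gate i = 1, the cell update c_t = f*c_{t-1} + i*g(x_t)
    degenerates into a counter. The candidate values g('a') = +1,
    g('b') = -1 are an assumed toy embedding so the state tracks #a - #b."""
    c = 0.0
    for ch in s:
        f, i = 1.0, 1.0                  # both gates saturated at 1
        g = 1.0 if ch == 'a' else -1.0   # assumed candidate values
        c = f * c + i * g                # standard LSTM cell-state update
    return c
```

On a string from a^nb^n the final state returns to 0, so a simple check on c recognizes balanced strings.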

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 1 hr
Goldberg: *counter machines* form a weird family of languages - non-empty non-inclusive relationship with context-free and context-sensitive formal languages (yes a^nb^n, but no palindromes)
#BlackboxNLP #emnlp2018
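[A one-counter machine of the kind described in the tweet above can be sketched in a few lines. This hypothetical recognizer accepts a^nb^n with a single integer counter; palindromes are out of reach because they need a stack, not a counter.]

```python
def accepts_anbn(s):
    """Hypothetical single-counter automaton for a^n b^n (n >= 1):
    increment on 'a', decrement on 'b'; reject on an 'a' that appears
    after a 'b', on a negative counter, or on any other symbol.
    Accept iff the counter ends at exactly 0."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:
                return False   # an 'a' after a 'b' breaks the a*b* shape
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1
            if count < 0:      # more b's than a's so far
                return False
        else:
            return False
    return seen_b and count == 0
```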

Yuval Pinter @🍺🍫 #emnlp2018 @yuvalpi · 2 hr
Goldberg: definition time. SRNN ("saturating RNN") := the vanilla, Elman flavour that uses tanh activation; IRNN := the same with ReLU. GRU and LSTM are the "gated" RNNs.
CLAIM: the gated kinds are provably more expressive than the non-gated ones.
#BlackboxNLP #emnlp2018
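[The tanh-vs-ReLU distinction behind the claim can be sketched with hypothetical one-unit update rules, weights fixed at 1 for illustration: the tanh state saturates below 1, while the ReLU state grows without bound, which is what lets a ReLU-style unit count.]

```python
import math

def srnn_step(h, x, w=1.0, u=1.0, b=0.0):
    # Elman-style SRNN step: tanh squashes the state into (-1, 1),
    # so a single unit cannot hold an unbounded count
    return math.tanh(w * x + u * h + b)

def irnn_step(h, x, w=1.0, u=1.0, b=0.0):
    # Same recurrence with ReLU: the state is unbounded and can count
    return max(0.0, w * x + u * h + b)

h_s = h_i = 0.0
for _ in range(100):              # feed the input x = 1 a hundred times
    h_s = srnn_step(h_s, 1.0)
    h_i = irnn_step(h_i, 1.0)
# h_s has saturated just below 1, while h_i has counted up to 100
```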