Search Results - Go Fighting!

  • Showing results 1 - 4 of 4.
  1.
  2.

    The Palgrave Handbook of Languages and Conflict

    Published 2019
    Full-text access
    View in OPAC
    e-Book
  3.

    New narratives of disability : constructions, clashes, and controversies /

    Published 2019
    Full-text access
    View in OPAC
    e-Book
  4.

    Deep Learning with Python, Second Edition. Author: Chollet, Francois

    Published 2021
    Contents: “…9.3.5 Putting it together: A mini Xception-like model -- 9.4 Interpreting what convnets learn -- 9.4.1 Visualizing intermediate activations -- 9.4.2 Visualizing convnet filters -- 9.4.3 Visualizing heatmaps of class activation -- Summary -- 10 Deep learning for timeseries -- 10.1 Different kinds of timeseries tasks -- 10.2 A temperature-forecasting example -- 10.2.1 Preparing the data -- 10.2.2 A common-sense, non-machine learning baseline -- 10.2.3 Let's try a basic machine learning model -- 10.2.4 Let's try a 1D convolutional model -- 10.2.5 A first recurrent baseline -- 10.3 Understanding recurrent neural networks -- 10.3.1 A recurrent layer in Keras -- 10.4 Advanced use of recurrent neural networks -- 10.4.1 Using recurrent dropout to fight overfitting -- 10.4.2 Stacking recurrent layers -- 10.4.3 Using bidirectional RNNs -- 10.4.4 Going even further -- Summary -- 11 Deep learning for text -- 11.1 Natural language processing: The bird's eye view -- 11.2 Preparing text data -- 11.2.1 Text standardization -- 11.2.2 Text splitting (tokenization) -- 11.2.3 Vocabulary indexing -- 11.2.4 Using the TextVectorization layer -- 11.3 Two approaches for representing groups of words: Sets and sequences -- 11.3.1 Preparing the IMDB movie reviews data -- 11.3.2 Processing words as a set: The bag-of-words approach -- 11.3.3 Processing words as a sequence: The sequence model approach -- 11.4 The Transformer architecture -- 11.4.1 Understanding self-attention -- 11.4.2 Multi-head attention -- 11.4.3 The Transformer encoder -- 11.4.4 When to use sequence models over bag-of-words models -- 11.5 Beyond text classification: Sequence-to-sequence learning -- 11.5.1 A machine translation example -- 11.5.2 Sequence-to-sequence learning with RNNs -- 11.5.3 Sequence-to-sequence learning with Transformer -- Summary -- 12 Generative deep learning -- 12.1 Text generation.…”
    Full-text access
    View in OPAC
    e-Book