CSCI 5922 Neural Networks and Deep Learning: Representation
Mike Mozer
Department of Computer Science and Institute of Cognitive Science
University of Colorado at Boulder
Representation
Definition? A formalism for capturing certain key features of some other domain.
E.g., a square can be represented as the set of its vertex coordinates, {(x1, y1), (x2, y2), (x3, y3), (x4, y4)}.
A representation makes certain domain information explicit (and easy to compute) and other information implicit (and difficult to compute).
Representing an English Word
Localist or one-hot encoding:
    ABLE  1 0 0 0 0 0 0 0 0 0 0 0 0
    AXLE  0 1 0 0 0 0 0 0 0 0 0 0 0
    BOLT  0 0 0 1 0 0 0 0 0 0 0 0 0
    GEAR  0 0 0 0 0 0 1 0 0 0 0 0 0
    WEAR  0 0 0 0 0 0 0 0 0 1 0 0 0
A neutral representation: each item is orthogonal to every other, so no generalization is expected from one item to another. Favored by end-to-end deep learning folks.
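The one-hot scheme above can be sketched in a few lines. A minimal sketch, assuming a toy vocabulary built from the five words shown on the slide:

```python
import numpy as np

# Toy vocabulary from the slide (the slide's 13-dimensional vectors imply
# a larger vocabulary; only these five words are shown).
vocab = ["ABLE", "AXLE", "BOLT", "GEAR", "WEAR"]

def one_hot(word, vocab):
    """Localist encoding: a single 1 at the word's index, 0 elsewhere."""
    v = np.zeros(len(vocab))
    v[vocab.index(word)] = 1.0
    return v

# Every pair of distinct words is orthogonal (dot product 0), so the
# encoding carries no similarity structure at all.
print(one_hot("ABLE", vocab) @ one_hot("AXLE", vocab))  # 0.0
print(one_hot("ABLE", vocab) @ one_hot("ABLE", vocab))  # 1.0
```

The zero dot product between any two distinct words is exactly what the slide means by a "neutral" representation.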
Representing an English Word
Distributed orthographic representation: one unit per letter-position conjunction (A1, B1, ..., B2, ..., X2, A3, L3, E4, R4, T4, ...):
    ABLE  1 0 0 1 0 0 0 0 1 0 1 0 0
    AXLE  1 0 0 0 0 1 0 0 1 0 1 0 0
    BOLT  0 1 0 0 0 0 0 0 1 0 0 0 1
    GEAR  0 0 0 0 0 0 1 0 0 0 0 1 0
    WEAR  0 0 0 0 0 0 1 0 0 0 0 1 0
Captures similarity of spelling patterns: natural generalization from a word to similarly spelled words (e.g., GEAR and WEAR share units).
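A letter-position encoding like this is easy to construct. A minimal sketch, assuming four-letter words and a full 26-letter alphabet per position (the slide shows only the units that some word activates):

```python
import numpy as np
import string

LETTERS = string.ascii_uppercase

def letter_position_code(word, n_positions=4):
    """One unit per (letter, position) conjunction: 26 * 4 = 104 units."""
    v = np.zeros(26 * n_positions)
    for pos, ch in enumerate(word):
        v[pos * 26 + LETTERS.index(ch)] = 1.0
    return v

# Similarly spelled words share units, so the encoding supports
# generalization from a word to its orthographic neighbors.
gear, wear, able = (letter_position_code(w) for w in ("GEAR", "WEAR", "ABLE"))
print(gear @ wear)  # 3.0: share E/2, A/3, R/4
print(gear @ able)  # 0.0: no letter in a shared position
```

The dot product directly counts shared letter-position units, which is the similarity structure the one-hot code lacks.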
Representing an English Word
Distributed semantic representation:
    ABLE  1 0 0 0 0 0 0 0 0 1 0 0 0
    AXLE  0 1 1 0 0 0 0 1 0 0 0 1 0
    BOLT  0 0 1 0 1 0 0 0 0 0 0 1 0
    GEAR  0 1 1 0 1 0 0 0 0 0 0 1 1
    WEAR  0 0 0 0 0 0 0 0 1 0 1 0 1
Captures similarity of meaning (e.g., AXLE, BOLT, and GEAR overlap with one another more than with ABLE or WEAR).
Using Statistics of Natural Language to Discover Word Representations Useful for Other Tasks
- Neural Probabilistic Language Model (Bengio, Ducharme, & Vincent, 2003)
- Word2Vec (Mikolov, Chen, Corrado, & Dean, 2013)
Both use unlabeled data to determine representations.
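To make "uses unlabeled data" concrete, here is a sketch of the (center, context) pair extraction that Word2Vec's skip-gram variant trains on; this is an illustrative simplification, not the actual Word2Vec implementation, and the sentence is a made-up example:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs from raw, unlabeled text.
    Words appearing in similar contexts end up with similar vectors."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the gear meshes with the worn gear".split()
print(skipgram_pairs(sentence, window=1)[:4])
# [('the', 'gear'), ('gear', 'the'), ('gear', 'meshes'), ('meshes', 'gear')]
```

No labels are needed: the surrounding words themselves supply the prediction targets.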
Comparison of Representations
- Local: a neuron is activated for only one concept; one neuron is activated per concept.
    ABLE  1 0 0 0 0 0 0 0 0 0
    AXLE  0 1 0 0 0 0 0 0 0 0
    BOLT  0 0 0 1 0 0 0 0 0 0
    GEAR  0 0 0 0 0 0 1 0 0 0
    WEAR  0 0 0 0 0 0 0 0 0 1
- Distributed: a neuron is activated for many concepts; many neurons are activated per concept (units A1, B1, ..., B2, ..., X2, A3, L3, E4, R4, T4, ...).
    ABLE  1 0 0 1 0 0 0 0 1 0 1 0 0
    AXLE  1 0 0 0 0 1 0 0 1 0 1 0 0
    BOLT  0 1 0 0 0 0 0 0 1 0 0 0 1
    GEAR  0 0 0 0 0 0 1 0 0 0 0 1 0
    WEAR  0 0 0 0 0 0 1 0 0 0 0 1 0
Comparing Representations
- Local: ABLE (one unit for the whole word).
- Distributed: A/1, B/2, L/3, E/4.
- Semi-distributed or sparse: letter pairs AB, BL, LE.
How Many Concepts Can Be Represented Simultaneously?
- Local: all of them (e.g., ABLE, GEAR, WEAN active at once).
- Distributed: one. Superimposing A/1, G/1, B/2, E/2, A/3, L/3, E/4, R/4 creates the binding problem: the active units no longer say which letters belong to which word.
- Semi-distributed: depends on similarity (e.g., the letter pairs AB, AR, BL, EA, GE, LE vs. AN, AR, EA, GE, WE).
How Many Neurons Are Required?
- Local: many; one per word (~2000 four-letter words).
- Distributed: few; one per letter per position (26 × 4 = 104).
- Semi-distributed: intermediate; one per letter pair (26² = 676).
Is Similarity Structure Captured?
- Local: no; a neutral representation (ABLE is no closer to AXLE than to BOLT).
- Distributed: yes; similarity can be measured by overlap, vector dot product, etc. (ABLE overlaps AXLE; GEAR overlaps WEAR).
- Semi-distributed: intermediate.
Does the Representation Support Generalization?
- Local: no, but it is easy to build arbitrary mappings (e.g., AT, AN, AM, IF, IT, IN each learned independently).
- Distributed: yes. Similar patterns have similar consequences, so learning generalizes (the words above share units such as A/1, I/1, T/2, N/2, M/2, F/2).
How Difficult Is It to Interpret a Representation?
- Local: easy (e.g., active units FINK .41, LINK .45, FIND .64, each naming a word).
- Distributed: hard (e.g., F/1 .48, I/2 .92, N/3 .62, K/4 .23, L/1 .54, D/4 .95: which word is this?).
Representing New Concepts
- Local: need to add neurons (e.g., new units for LINE, LEFT, LIFT).
- Distributed: need only specify a unique pattern of activation across the existing neurons (e.g., LINE = L/1, I/2, N/3, E/4; LIFT = L/1, I/2, F/3, T/4).
Tensor Product Representation (Smolensky, 1990)
A formalism for representing role-filler pairs. E.g., filler: letter; role: position in word.
Notation:
- r: role vector
- f: filler vector
- f ⊗ r: outer (tensor) product
E.g., binding letter C to position 1 yields a letters-by-positions matrix (rows A through Z, columns 1 through 7) with a single 1 at row C, column 1 and 0s elsewhere.
Representing Multiple Role-Filler Pairs
T = Σᵢ fᵢ ⊗ rᵢ
E.g., {C/1, A/2, B/3}: the letters-by-positions matrix has 1s at (C, 1), (A, 2), and (B, 3), and 0s everywhere else.
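The sum of outer products above can be sketched directly with NumPy, assuming one-hot filler and role vectors as on the slide:

```python
import numpy as np
import string

LETTERS = string.ascii_uppercase
N_POSITIONS = 7  # columns of the slide's letters-by-positions matrix

def filler(letter):
    """One-hot filler vector over the 26 letters."""
    f = np.zeros(len(LETTERS))
    f[LETTERS.index(letter)] = 1.0
    return f

def role(position):
    """One-hot role vector over the 7 positions (1-indexed)."""
    r = np.zeros(N_POSITIONS)
    r[position - 1] = 1.0
    return r

def encode(pairs):
    """T = sum_i f_i (outer) r_i: superimpose the role-filler bindings."""
    return sum(np.outer(filler(l), role(p)) for l, p in pairs)

T = encode([("C", 1), ("A", 2), ("B", 3)])
print(T[LETTERS.index("C"), 0])  # 1.0: C bound to position 1
print(T[LETTERS.index("A"), 1])  # 1.0: A bound to position 2
print(T.sum())                   # 3.0: one binding per role-filler pair
```

With orthogonal role vectors, each binding occupies its own cell of the matrix, so superimposed pairs never interfere.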
Distributed Roles
Role vectors for nearby positions overlap: each role vector has activation 1 at its own position and .5 at the adjacent positions (e.g., position 2 → (.5, 1, .5, 0, 0, 0, 0)).
What will this type of representation achieve? Words with the same letter in nearby positions will overlap in similarity, e.g., CAT and ACT.
Similarity of ACT and CAT
With a localist role representation, the only overlap comes from the shared T in position 3.
With a distributed (overlapping) role representation, the similarity is larger: C and A occupy adjacent positions in the two words, so their bindings overlap as well.
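The comparison can be computed explicitly. A sketch assuming one-hot fillers, 7-dimensional role vectors, and the (.5, 1, .5) neighbor profile for distributed roles; the exact numbers depend on these assumptions:

```python
import numpy as np
import string

LETTERS = string.ascii_uppercase

def filler(letter):
    f = np.zeros(26)
    f[LETTERS.index(letter)] = 1.0
    return f

def role_localist(i, n=7):
    r = np.zeros(n)
    r[i] = 1.0
    return r

def role_distributed(i, n=7):
    # 1 at the position itself, 0.5 spread to each neighbor (assumed profile)
    r = np.zeros(n)
    r[i] = 1.0
    if i > 0:
        r[i - 1] = 0.5
    if i < n - 1:
        r[i + 1] = 0.5
    return r

def encode(word, role_fn):
    return sum(np.outer(filler(c), role_fn(i)) for i, c in enumerate(word))

def sim(a, b):
    return float(np.sum(a * b))  # dot product of the two tensors

print(sim(encode("CAT", role_localist), encode("ACT", role_localist)))        # 1.0 (shared T/3 only)
print(sim(encode("CAT", role_distributed), encode("ACT", role_distributed)))  # 3.5 (C and A overlap across adjacent positions too)
```

Under these assumptions the distributed roles raise the CAT/ACT similarity from 1.0 to 3.5, which is the effect the slide describes.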
Distributed Fillers
Filler vectors can also be distributed, e.g., to capture vowel vs. consonant status or phonetic similarity. If the filler vectors for A and E overlap, the tensor-product representations of BAT and BET overlap as well.
Representing Magnitudes
E.g., position along a line.
- Spatial encoding (also known as place or value encoding): a local representation.
- Frequency encoding (also known as variable encoding): has properties of a distributed representation.
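One plausible reading of the two schemes, sketched for a magnitude x in [0, 1); the bin count and the choice to spread the rate across all units are assumptions for illustration:

```python
import numpy as np

def spatial_code(x, n_units=10):
    """Place/value encoding: one unit per interval of the line; only the
    unit whose interval contains x fires. A local code for magnitude."""
    v = np.zeros(n_units)
    v[min(int(x * n_units), n_units - 1)] = 1.0
    return v

def frequency_code(x, n_units=10):
    """Frequency/variable encoding: units fire at a rate proportional to x.
    Magnitude is carried by activation level, not by which unit is on."""
    return np.full(n_units, x)

print(spatial_code(0.35))    # single 1 in bin 3
print(frequency_code(0.35))  # every unit at 0.35
```

In the spatial code, *which* unit is active carries the value; in the frequency code, *how strongly* the units are active carries it.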
Psychologically Grounded Representations
Music composition network (Mozer, 1994). Task: predict the next pitch in a diatonic scale.
- With a one-hot pitch representation: 54.5% accuracy.
- With a psychologically grounded representation: 98.4% accuracy.
Log Scaling
Appropriate for capturing wide dynamic ranges in values (e.g., time, brightness).
Appropriate for capturing human representations, e.g., psychophysical quantities (Weber-Fechner law: ΔI / I = constant).
E.g., churn prediction (length of time with carrier).
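A minimal sketch of log scaling a feature with a wide dynamic range; the tenure values are made up for illustration:

```python
import numpy as np

# Hypothetical raw feature spanning several orders of magnitude,
# e.g., days a customer has been with the carrier.
days_with_carrier = np.array([1.0, 10.0, 100.0, 1000.0])

# log1p compresses the wide range: equal *ratios* in the input become
# roughly equal *differences* in the output, mirroring the Weber-Fechner
# relation (delta I / I = constant).
scaled = np.log1p(days_with_carrier)
print(np.round(scaled, 3))  # [0.693 2.398 4.615 6.909]
```

After scaling, a jump from 1 to 10 days and a jump from 100 to 1000 days occupy comparable spans of the input to the network, which is usually what the task demands.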