questions.org


Chapter 1: Anything From Almost Nothing

1.6 Multiple Arguments

There’s an expression in here:

(λx(λy.xy))(λz.a) 1

Which I would expect to look like:

(λx.(λy.xy))(λz.a) 1

Especially given that in earlier parts of this section, there is a dot in that location.

Why was the dot omitted?

Chapter 2: Basic Expressions and Functions

  • The syntax for indentation was introduced, but not for semicolons. What’s the deal with them?

Exercises with Let and where

I’m confused about what they’re asking from us. Where clauses are not expressions, so we can’t translate these directly without doing some sort of binding.

What did other people come up with?
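
One way to read the exercise (a sketch; function names may not match the book exactly): take a definition written with a where clause and rebind the same names with let, since let ... in ... is an expression and can go anywhere an expression can.

-- where version
printInc :: Integer -> IO ()
printInc n = print plusTwo
  where plusTwo = n + 2

-- the same thing as a let expression
printInc' :: Integer -> IO ()
printInc' n =
  let plusTwo = n + 2
  in  print plusTwo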

Chapter 3: String

Types of concatenation functions

Prelude> :type (++)
(++) :: [a] -> [a] -> [a]
Prelude> :type concat
concat :: Foldable t => t [a] -> [a]

But the book says:

concat :: [[a]] -> [a]

I guess the book does explain this.
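
The two signatures agree once t is specialized to the list type: Foldable t => t [a] -> [a] with t ~ [] is exactly [[a]] -> [a]. A small check (concatLists is my own name):

concatLists :: [[a]] -> [a]
concatLists = concat   -- typechecks: the Foldable type specializes to lists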

Concatenation and scoping

  • Can we talk about “right associativity”?
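
For reference, (++) is declared infixr 5, so an unparenthesized chain groups to the right. A tiny illustration (my own names):

assocDemo :: String
assocDemo = "a" ++ "b" ++ "c"       -- parsed as "a" ++ ("b" ++ "c")

assocDemo' :: String
assocDemo' = ("a" ++ "b") ++ "c"    -- same result here, since (++) is associative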

Building functions

  • Let’s talk about exercise 5. I want to see others’ implementations (pg 115)

Chapter 4: Basic Types

Let’s talk about indentation

Chapter 5: Types

  • “Thinking about types as being like sets will help guide your intuition on what types are and how they work in a mathematical sense” pg 145.
    • How?
  • pg 149 talks about the info of (->). Can we break this down?
    • It’s a “type constructor for functions”
    • I think:
      • It’s infix, with low precedence
      • and right associative
  • pg 161 “Currying and uncurrying functions of three or more arguments automatically is quite possible but trickier”
    • Want to try? (A sketch follows at the end of this list.)
  • pg 161 - 164 – Maybe go through these quickly?
  • pg 166 “A subclass cannot override the methods of its superclass”
    • Really? Why?
  • pg 167 These are pretty cool questions
  • pg 175, question 3, What is the answer? I said (b)
  • pg 179, question 6, Let’s do this
  • pg 183 Principal types. Let’s talk about what they are
  • What is the type of the empty list?
    • Like, is it [a]?
  • pg 185. Let’s talk about the difference between parametric and ad-hoc polymorphism. Especially in regards to these definitions.
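
On the pg 161 question about currying and uncurrying three-argument functions, here is a minimal sketch (curry3 and uncurry3 are my own names, not from the book):

curry3 :: ((a, b, c) -> d) -> a -> b -> c -> d
curry3 f a b c = f (a, b, c)

uncurry3 :: (a -> b -> c -> d) -> (a, b, c) -> d
uncurry3 f (a, b, c) = f a b c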

Chapter 6: Typeclasses

Pages/questions from version 0.11.2

  • pg 188: Let’s talk about the expression problem
  • pg 188: What does it mean for typeclasses to “dispatch on type”
  • General: Talk about “Implements” vs “Has instances for”
  • pg 194: “Since Real cannot override the methods of Num, this typeclass inheritance is only additive and the ambiguity problems caused by multiple inheritance in some programming languages is avoided.”
  • pg 196: Why don’t we need to constrain by Fractional and Num?
  • General: What do the lines in :info Typeclass that look like:

    {-# MINIMAL compare | (<=) #-}

    mean? (A sketch follows at the end of this list.)

  • pg 207: Talk about IO (), specifically, why main has to be IO ().
    • Also “An IO action that returns ()”
  • pg 210: “”
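
On the MINIMAL question above, a sketch of what the pragma expresses, using a made-up class: either method alone gives a complete instance, because each has a default written in terms of the other.

class MyEq a where
  eq  :: a -> a -> Bool
  neq :: a -> a -> Bool
  eq  x y = not (neq x y)
  neq x y = not (eq x y)
  {-# MINIMAL eq | neq #-}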

Pages/questions from version 0.12.0

  • pg 179: Can we walk through a couple of these?
    • Specifically 2, 5, 6
    • Very specifically: 5. I don’t think I got this one right.
  • pg 200: Can we unpack the language in the first few paragraphs of 6.12? Namely “instances”
  • I want to talk about dispatching on type as well.
  • pg 208
    • Exercise 1, can we make it typecheck?
    • Exercise 3.b. Can we walk through this error?
  • General: What’s with data definitions like: data MyThing = Thing Integer? Will we cover these later? (A small sketch follows this list.)
  • pg 209: Exercise 4. This totally typechecks. Talk about data constructor partial application?
  • pg 210: Exercise 1. Why???? I would think you could be more general…
    • This seems like a trick based on something we haven’t learned yet, and less about typeclasses.
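
On the data MyThing = Thing Integer question above, a sketch (TwoThings is a made-up type): a data constructor is just a function from its fields to the type, so a constructor with more than one field can be partially applied like any curried function.

data MyThing = Thing Integer deriving Show
-- Thing :: Integer -> MyThing

data TwoThings = TwoThings Integer String deriving Show

halfBuilt :: String -> TwoThings
halfBuilt = TwoThings 7   -- the constructor applied to only its first field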

Chapter 7: More Functional Patterns

  • pg 223: In
    (\x -> x * 3) :: Integer -> Integer
        

    Are the two uses of (->) the same, on the left and right side of ::?

  • pg 224: Exercise 1. They produce the same effect, but the method of currying is different, right? Actually, I just tested this in the repl:
    -- using (++) makes it clearer
    (\x -> \y -> \z -> x ++ y ++ z)
    mTh x y z = x ++ y ++ z
        

    I guess it just binds the leftmost arguments first, which makes sense. (See the sketch after this list.)

  • pg 225: Any and all data constructors!!!
  • pg 239: talk about explicit parenthesization of type signatures
  • pg 251: Question 5. I don’t understand
  • General: Is there such a thing as exponent types??
  • pg 261: Question 1.a) Does divMod make this simpler? I don’t think I got the right answer.
  • pg 263: Question 6: let’s talk about this
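
On the pg 224 exercise above, a sketch of why the nested-lambda and multi-argument forms behave the same: both are curried, so partial application binds the leftmost argument first in each.

mTh1 :: String -> String -> String -> String
mTh1 = \x -> \y -> \z -> x ++ y ++ z

mTh2 :: String -> String -> String -> String
mTh2 x y z = x ++ y ++ z

-- mTh1 "a" and mTh2 "a" are both functions of type String -> String -> String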

Chapter 8: Recursion

  • pg 277: Recursion is self-referential composition. What does that mean?
  • pg 280: “Bottom is used in Haskell to refer to computations that do not successfully return a value.” So bottom is not itself a value? What does it mean to refer to a “computation” in this sense?
  • pg 292: Recursion question 2. We talk about partial functions being bad, but we’re writing them for exercises like this. Is that something we should be concerned about?
    • Or at least, it is “bottom” when given a negative argument, or floating point argument. (A sketch of a total version follows this list.)
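
On the partial-functions worry above, one sketch of making such a definition total by signalling bad input with Maybe (my own example, not the book's exercise):

safeFactorial :: Integer -> Maybe Integer
safeFactorial n
  | n < 0     = Nothing      -- instead of bottoming out on negative input
  | otherwise = Just (go n)
  where go 0 = 1
        go k = k * go (k - 1)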

Chapter 9: Lists

  • pg 302: What is a spine? What is a cons cell?
  • pg 315: No really, what is a “spine”?
  • pg 317: “The length function is only strict in the spine”
    • What does this mean?
  • pg 317: “Values in Haskell get reduced to weak head normal form by default”
    • What?
  • pg 319: If a range isn’t evaluated, what is it? How does it know where to stop?
  • pg 341: Is there any difference between the builtin list and the custom definition other than syntax?
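
For the pg 341 question, a sketch of a hand-rolled list next to the built-in one (names roughly follow the chapter): aside from the special [] syntax and the library support the built-in type gets, they have the same shape.

data List a = Nil | Cons a (List a) deriving Show

toBuiltin :: List a -> [a]
toBuiltin Nil         = []
toBuiltin (Cons x xs) = x : toBuiltin xs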

Chapter 10: Folding Lists

  • pg 346: “Also, each of them has a main function with a recursive pattern that associates to the right”. Can we map this out? (A sketch follows at the end of this list.)
  • pg 347: can we walk through this definition, in light of right associativity.
  • pg 353: I don’t understand, given the definition of fold, how the examples with undefined do not end up evaluated
  • pg 355: Let’s go over left associativity. What is an example when you would want this?
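
On the right-association questions above, a sketch of how foldr expands. This also bears on the pg 353 question: if f never demands its second argument, the recursive call, and any undefined inside it, is never evaluated.

-- foldr f z [1, 2, 3] = f 1 (f 2 (f 3 z))
foldrSketch :: (a -> b -> b) -> b -> [a] -> b
foldrSketch _ z []     = z
foldrSketch f z (x:xs) = f x (foldrSketch f z xs)

shortCircuit :: Bool
shortCircuit = foldrSketch (\x _ -> x) True [True, undefined]   -- True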

Chapter 11: Algebraic datatypes

  • pg 386: Are there “kinds of kinds”? Sorts?
  • pg 386: what does it mean for a kind to be “fully applied”?
  • ** Are the only “kinds” fully applied vs waiting for application?
  • pg 388: Any uses for “phantom” types?
  • pg 390: “types resolve at compile time” no introspection capabilities at all?
  • ** pg 397: Can we talk about cardinality/types/sets here?
    • is (Int, Int) higher cardinality than (Int)?
    • In math it wouldn’t be
  • pg 398: what is that min/max bound syntax??
  • ** Can we make a list of what an algebra requires
    • identity function?
    • sum?
    • product?
  • pg 403: what would the definition with a type synonym look like?
  • When would you ever use type synonyms over newtype?
  • pg 404: exercise (1), what? Can we go over this?
    • Actually, I kinda get this, (Int, String) is a single type though?
  • pg 404: exercise (3) can we go over this?
  • pg 409: “the cardinality of a datatype roughly equates to how difficult it is to reason about” – discuss
  • ** Can we draw out all the ways to define new types? And talk about them?
  • ** pg 412: Why describe something as “Normal form” in this context?
  • pg 412: I don’t understand how this second Author definition works
  • pg 413: Did I get this answer right?
  • pg 423: Exercise Programmers I used list comprehensions, what did others use?
  • pg 428: Exponential == enumerating all possible implementations of a function
  • pg 444: We could write foldr for binary trees
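
A sketch of that foldr for binary trees, assuming the chapter's shape data BinaryTree a = Leaf | Node (BinaryTree a) a (BinaryTree a); this folds the values in order:

data BinaryTree a = Leaf | Node (BinaryTree a) a (BinaryTree a)

foldTree :: (a -> b -> b) -> b -> BinaryTree a -> b
foldTree _ z Leaf         = z
foldTree f z (Node l x r) = foldTree f (f x (foldTree f z r)) l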

Chapter 12: Signalling Adversity

  • pg 459: I don’t understand the functor thing. Can we talk about what it means “Functor will not map over the left type argument because it has been applied away”? (A sketch follows this list.)
  • pg 464: Lifted vs unlifted. What is an example of an unlifted type in Haskell?
  • pg 471: determine the kinds (1) and (2), what are the answers?
  • pg 478: Is my better iterate better?
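
On the pg 459 question, a sketch using Either: the Functor instance is written for (Either e), so the left type argument is already fixed before fmap can see it, which is why only Right values get mapped.

-- the library instance looks roughly like:
--   instance Functor (Either e) where
--     fmap _ (Left e)  = Left e
--     fmap f (Right a) = Right (f a)

mapped :: Either String Int
mapped = fmap (+1) (Right 1)      -- Right 2

untouched :: Either String Int
untouched = fmap (+1) (Left "no") -- Left "no", the Left value is never touched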

Chapter 13: Building Projects

  • Can you use stack offline? Like, if I already have a version of Haskell installed, does it really need to download something?

Chapter 14: Testing

  • pg 534: “If we had not asserted the type of x in the property test, the compiler would not have known what concrete type to use”
    • Why does the compiler need to know this? (A sketch follows this list.)
  • pg 535: “When you use the arbitrary value, you have to specify the type to dispatch the right typeclass instance, as types and typeclasses instance form unique pairings”
    • What does this mean?
  • pg 536: How the fuck does this arbitrary typecasting shit work???
    • It does say it’s MAGIC, but like, wow
  • pg 552: Can we talk about the types of properties and whatnot?
  • pg 554: Can we go through these Arbitrary typeclass implementations
  • pg 561: How do you organize these tests?
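
On the pg 534 question, a sketch of the alternative to asserting the type inline: give the property a monomorphic signature, so QuickCheck knows which Arbitrary instance (and hence which generator) to use.

import Test.QuickCheck

prop_plusAssociative :: Int -> Int -> Int -> Bool
prop_plusAssociative x y z = x + (y + z) == (x + y) + z

main :: IO ()
main = quickCheck prop_plusAssociative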

Chapter 15: Monoid, Semigroup

  • Does it make sense to define a “best” monoid instance for the actual type, and newtypes for the less good ones?
  • pg 585: Let’s go over the solution for the Optional monoid
  • An example of a binary operator without an identity:
    • first, last
    • More? (A small sketch follows this list.)
  • pg 592: What was the point of “Madness” at all???
  • pg 593: How does this quickcheck thing work?
  • pg 601: Couldn’t you write a Monoid for NonEmpty:
instance Monoid a => Monoid (NonEmpty a) where
  mempty = mempty :| []
  mappend (d1 :| xs) (d2 :| ys) = (mappend  d1 d2) :| (xs ++ ys)
  • pg 606 How do we have a newtype with two type variables? I thought that wasn’t possible.
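
On the no-identity example above, a sketch: “take the first argument” is associative, so it forms a Semigroup, but no element can act as an identity on both sides (First' is my own name; Data.Semigroup.First is the library version).

newtype First' a = First' a deriving Show

firstOp :: First' a -> First' a -> First' a
firstOp x _ = x
-- associative, but there is no e with firstOp e x == x for every x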

Chapter 16: Functors

  • pg 617: “querying the kind of the function type constructor” holy shit! Is -> just a regular type constructor??
  • pg 620: Why is T capitalized here?
  • Mention that the idea of “kind inference” is pretty cool (see pg 620)
  • pg 621: This parallel between function application and fmaps is neat! And the mnemonic for the similarity between $ and <$>
  • General I’m just not sure what “lifting” means. Is there a mnemonic for remembering/understanding this?
    • “lifting a function”, like lifting it into the structure?
  • Is it possible for a functor implementation to violate the composition law without violating the identity law?
    • Assume we have violated composition…
  • ?? What’s the deal with fmapping over functions?
    • use question 3 on pg 641 as an example
  • pg 632: Can we go over how (fmap . fmap) works? (A sketch follows at the end of this list.)
  • pg 646: What is the difference between the two quickChecks here:
main :: IO ()
main = do
  quickCheck $ property (\x -> (x :: Int) + 1 == x + 1)
  quickCheck (\x -> x + 1 == (x :: Int) + 1)

They seem to work the same?

  • General Is there a shorter way of writing:
blah :: (Num a, Num b) => a -> b
  • pg 652: Ahhh, here’s the reason why you can’t apply the function to Left! It’s not just convention. (See exercise “Short Exercise 2” solution.)
  • pg 658: We’re using the RankNTypes language extension. I don’t need to know exactly what it does, or how it works, but I’m wondering if we know what the “Rank” of a type is.
  • Chapter Exercises
    • “Quant”: Why was this so hard to Hspec test? Can we look at my tests.
    • Can we work through how the Flip implementation works?
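
On the pg 632 question, a sketch of why (fmap . fmap) reaches through two layers: the outer fmap lifts over the outer structure, the inner one over the structure inside it.

-- fmap        :: Functor f => (a -> b) -> f a -> f b
-- fmap . fmap :: (Functor f, Functor g) => (a -> b) -> f (g a) -> f (g b)

twoDeep :: Maybe [Int]
twoDeep = (fmap . fmap) (+1) (Just [1, 2, 3])   -- Just [2,3,4]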


Chapter 17: Applicative

  • pg 671: Why would you ever use liftA over fmap?
  • pg 672: “But the increase in power it introduces is profound.” Should this be obvious?
  • pg 676: “use :i on (,)” This is nuts, (,) is just a regular old data constructor.
    • I think it’s very interesting how few “primitive” data structures there are in Haskell.
  • pg 684: first two lines are cool examples of how this could be useful.
  • pg 685, question 4. What’s the intended value for sum? Should it be summing the elements of the tuple? Is it not because tuple implements Foldable in a weird way?
  • pg 701-706: ?? Can we write something that violates each of these laws? Identity, Composition, Homomorphism, Interchange. Specifically Homomorphism and Interchange.
  • pg 709: I’m not sure I got anything out of this section. How does this all work? What is it all for?
  • pg 711: How does this arbitrary stuff work? I don’t follow the fmapping, but I can make the following do something:
:l src/zipListMonoid

sample (arbitrary :: Gen (ZipList Int))
  • pg 712: I’m really proud of my Arbitrary instance for List'
  • pg 720: Why does the following fail interchange:
instance Applicative Pair where
  pure x = Pair x x
  (<*>) (Pair f g) (Pair x y) = Pair (f x) (f y)
  • pg 704: Why does the following fail all the laws? (A lawful version for comparison follows below.)
instance Monoid a => Applicative (Two a) where
  pure = Two mempty
  (<*>) (Two i g) (Two x y) = Two i (g y)
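
For comparison with the pg 704 instance above, a sketch of the commonly used lawful instance, assuming data Two a b = Two a b: both first components are combined with mappend, and the function in the second slot is applied to the second value.

data Two a b = Two a b deriving Show

instance Functor (Two a) where
  fmap f (Two a b) = Two a (f b)

instance Monoid a => Applicative (Two a) where
  pure x = Two mempty x
  (<*>) (Two u f) (Two v x) = Two (mappend u v) (f x)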

Chapter 18: Monads

  • pg 724: fmap f xs = xs >>= return . f I’d like to work out how this works.
  • How does this work?
blah :: [Int]
blah = do
  x <- [1, 2, 3]
  y <- [2, 4, 5]
  [x, y]
  • Why are these different:
blah' :: [Int]
blah' = do
  x <- [1, 2, 3]
  y <- [2, 4, 5]
  (\x -> [x, x]) x

blah'' :: [Int]
blah'' = do
  x <- [1, 2, 3]
  (\x -> [x, x]) x
  • *** pg 765: It talks about creating structure and joining it together. How does that work? This is recurring throughout the chapter.

Answer:

“Generating structure” here refers to the second argument of >>=: a function that returns a value already wrapped in the monad. You need a special operation to combine that with an existing monadic value, and that is what >>= gives you.
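
A tiny sketch of that, using the list monad (which is also what the blah questions above use): the function argument generates a new layer of structure per element, and >>= flattens it back into one layer.

twice :: Int -> [Int]
twice x = [x, x]

joined :: [Int]
joined = [1, 2, 3] >>= twice         -- [1,1,2,2,3,3]

notJoined :: [[Int]]
notJoined = fmap twice [1, 2, 3]     -- [[1,1],[2,2],[3,3]], structure not joined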

  • pg 765: Why is =<< introduced here for the first time?
  • pg 765: Exercise 3. Can we go over this?
  • pg 765: Exercise 4. I did it with do syntax, and bind. I really like how my do syntax answer came out, but can we make my bind answer better?


Chapter 19: Applying Structure

  • pg 777: Here, and earlier, we see functions as having Monoid instances, Applicative instances, Functor instances. Can we talk about what it means to fmap over functions, etc.?

Chapter 20: Foldable

  • Is there a better way to write the elem lambda? (A sketch follows this list.)
  • Is there a better way to do the minimum / maximum other than guards?
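
On the elem question above, one sketch that avoids the lambda by using an operator section with foldMap and the Any monoid (names are mine):

import Data.Monoid (Any (..))

elemF :: (Foldable t, Eq a) => a -> t a -> Bool
elemF x = getAny . foldMap (Any . (== x))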

Chapter 21: Traversable

  • pg 813: Why does Traversable require a Functor t, but traverse requires an Applicative f?
  • General: Can we review the Applicative/Monad dependency?
  • pg 820: (x .) . y as a general pattern for getting a new function that takes two arguments? (A sketch follows this list.)
  • Really, why applicative? And not functor?
  • pg 830: How to quickcheck S (n a) a
  • Why do we need any of the constraints here at all?
  • pg 830: QuickChecking Tree
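
On the (x .) . y question above, a sketch of the pattern: it composes a one-argument function onto the result of a two-argument function (compose2 is my own name).

compose2 :: (c -> d) -> (a -> b -> c) -> a -> b -> d
compose2 f g = (f .) . g
-- compose2 f g a b == f (g a b)

demo :: Int
demo = compose2 negate (+) 2 3   -- negate (2 + 3) == -5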

Chapter 22: Reader

  • pg 835: The “structure” is partially applied functions? Does it only work for partially applied functions? Why?
  • ** Can we go through the functor/applicative/monad laws for the behavior of functions instances of these typeclasses?
  • Is it clarifying to say “Partially applied functions have a Functor instance”? Like, partially applied Either values have a Functor instance? Or partially applied Constant values?
  • “The applicative and monad instances give us a way to map a function that is awaiting an a over another function that is also awaiting an a.” How can we see that from the type signature of <*>? (A sketch of the function Applicative follows at the end of this list.)

(<*>) :: Applicative f => f (a -> b) -> f a -> f b

Is f a above “a function awaiting an a”? Does that make sense given the Applicative instance for functions is for ((->) r)?

Given that interpretation, it would make sense that the first argument is a function awaiting an (a -> b).

Then, how is (+) :: (Num a => a -> a -> a) awaiting an a -> b? Isn’t (->) right associative? Like (Num a => a -> (a -> a)), not left associative like (Num a => (a -> a) -> a). Still, the following works:

λ> (+) <*> (* 2) $ 3
9
  • There is so much crazy stuff in section 22.3 for such a short section. Can we break it down/talk about it?
  • Is there any way of destructuring functions?
  • Maybe partially applied is the wrong way of describing it. Partially specialized?
  • pg 848. Write liftA2 yourself. Did I do this right?
  • pg 849. Write the applicative for reader. Did I get it right?
  • pg 855 What does this ReaderT thing do at all??
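
On the function-Applicative discussion above, a sketch of the instance that makes (+) <*> (* 2) $ 3 come out as 9. The commented instance is the library's instance for ((->) r), paraphrased; fnAp is my own name.

-- instance Applicative ((->) r) where
--   pure x  = \_ -> x
--   f <*> g = \r -> f r (g r)

fnAp :: Integer
fnAp = ((+) <*> (* 2)) 3   -- (\r -> r + (r * 2)) 3 == 9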

Chapter 23 State

  • pg 870: The results of all of my calls to
    evalState rollDieThreeTimes' (mkStdGen 0)
        

    result in tuples that start with DieSix, no matter what value I place in the 0. Is this expected?

    Answer: Actually, it turns out it’s only DieSix in certain ranges. See:

    length $ filter (== DieSix) (map (\n -> evalState rollDie (mkStdGen n)) [1..10000])
        

    vs

    length $ filter (== DieSix) (map (\n -> evalState rollDie (mkStdGen n)) [100000..200000])
        
  • pg 873:

Should the applicative be defined as:

instance Applicative (Moi s) where
  -- pure :: a -> Moi s a
  pure a = Moi (\s -> (a, s))

  -- (<*>) :: Moi s (a -> b) -> Moi s a -> Moi s b
  (Moi f) <*> (Moi g) = Moi applied
    where applied s =
            let (val, state) = g s
                f' = fst $ f s
            in (f' val, state)

or

instance Applicative (Moi s) where
  -- pure :: a -> Moi s a
  pure a = Moi (\s -> (a, s))

  -- (<*>) :: Moi s (a -> b) -> Moi s a -> Moi s b
  (Moi f) <*> (Moi g) = Moi applied
    where applied s =
            let (val, state) = g s
                f' = fst $ f state
            in (f' val, state)

That is, how should the state be passed through the process of getting the function out of Moi f, and into applying Moi g?
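
For comparison, a sketch of the conventional ordering (mirroring the State applicative): run the wrapped function first, then feed its resulting state to the wrapped argument.

newtype Moi s a = Moi { runMoi :: s -> (a, s) }

instance Functor (Moi s) where
  fmap f (Moi g) = Moi $ \s -> let (a, s') = g s in (f a, s')

instance Applicative (Moi s) where
  pure a = Moi $ \s -> (a, s)
  (Moi f) <*> (Moi g) = Moi $ \s ->
    let (fab, s')  = f s
        (a,   s'') = g s'
    in  (fab a, s'')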

Chapter 24 Parser Combinators

  • pg 891: Exercises 2 and 3 use functions we haven’t been introduced to. I don’t know how to solve these exercises.
  • pg 894-895: How does parseFraction work? Can we talk about its types? (A sketch follows this list.)
    • namely, the statement: “The final result has to be a parser so we embed our integral value in the Parser type by using return”
    • What does that imply about the types of numerator and denominator?
  • pg 904: I vaguely understand what some vs many are. But can we unpack this mutual recursion?
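
On the parseFraction question above, roughly the shape of the parser (details may differ from the book; decimal and char come from trifecta): decimal yields Integer, so numerator and denominator are Integers, and return lifts the computed Rational back into Parser.

import Data.Ratio ((%))
import Text.Trifecta

parseFraction :: Parser Rational
parseFraction = do
  numerator <- decimal
  _ <- char '/'
  denominator <- decimal
  return (numerator % denominator)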

Chapter 25 Composing Types

  • pg 958: “Type constructors can take other type constructors as arguments too”, just as functions can take other functions as arguments. This is what allows us to compose types.
    • This is a neat statement. I wonder what the implications are. (A sketch follows this list.)
  • pg 955: I don’t think I understand what “closed under composition” really means (or implies?). I’d like to revisit this in the context of knowing that monads are not closed under composition.
  • pg 956: I got the answer to this exercise, but I don’t really understand how it works. Can we walk through it?
  • pg 992: I’m starting to get how monad transformers work. It feels like a hack, though. What if I wanted an IO Reader, rather than the other way around? Am I missing some generality? How do Haskellers decide what should be a transformer and what should be a regular monad?
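
On the pg 958 remark above, a sketch of what “type constructors taking type constructors as arguments” buys us, using the chapter's Compose shape:

newtype Compose f g a = Compose { getCompose :: f (g a) }

composed :: Compose [] Maybe Int
composed = Compose [Just 1, Nothing, Just 3]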

Chapter 26 Monad Transformers

  • pg 980: I understand how the MaybeT transformer is constructed now, but why does it matter?
  • pg 981: I’m trying to quickcheck all this, and writing instances for Eq, EqProp, and Show are really hard for EitherT.
  • pg 981: Should eitherT be defined in terms of Monad operations? Or case statements? Or both?
  • pg 983, how does the double fmap here work? Is it at all like:
    λ> (fmap . fmap) (+1) Just $ 5
        
  • pg 984: “This is one of the most used monad transformers” Then why no examples of use?
  • pg 990: I don’t get this at all. Can we break it down.
  • pg 993: “Speaking generally, it is about lifting actions in some Monad over a transformer type which wraps itself in the original Monad.”
    • what the hell?
    • what is an “action” here
    • “which wraps itself” -> what is this referring to? The monad, or the transformer, or the action?

Chapter 27 Non-Strictness

  • pg 1034: How does something like the following work:
    • let wc x z = let y = bot `seq` 'y' in y `seq` x

or, laid out like this:

wc x z =
  let y = bot `seq` 'y'
  in y `seq` x
  • pg 1044: I’m not sure how to expand these. Especially the simple ones. I haven’t seen an example of what you want.
  • pg 1044: What about call by reference, is this like call by name? What does it mean to “only evaluate once”? What are some examples of each of these things? I feel like I don’t have a grasp on some of these concepts.
  • pg 1046 - This example, with GHCi version 8.0.1, does not give a fully evaluated myList = [1, 2, 3]; instead it gives myList = _
  • pg 1051 - I don’t see the sharing behavior in my version of GHCi (8.0.1). I see the trace happening twice in the addition example.
  • pg 1052 - It’s very unclear to me what effect explicit vs implicit parameters have on sharing. It seems like there are contradictory statements on the page, and the examples don’t seem to relate to the statements.
  • pg 1055. At the end of the example at the top of the page, there’s a reference to bla which has not been defined. And eval'd which I haven’t seen before.
  • pg 1065: “At times we’d also like to evaluate the contents to weak head normal form just like with functions.” When? Why? This is entirely unmotivated.
  • Why doesn’t this chapter talk about thunks, or the performance and memory costs of strictness/non-strictness, more? I don’t care how the tools to turn it on and off work if I don’t know how to recognize when I would want to. I realize this is a big topic, but I don’t have even a small sense of the practical applications of this after reading this chapter.