Restricting crawls

Could someone please explain the difference between

  • Restrict to start domain(s)
  • Restrict to sub-path(s)

when I’m adding a list of URLs to crawl. I’d like to just stay on the sites and not follow links off them to Twitter etc.
Is “Restrict to start domain(s)” the right way to go about that?
Or is it “Restrict to sub-path(s)”?
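
To illustrate what I’m hoping for, here’s roughly how I imagine the two options filter links. This is only my guess at the behaviour, and example.com and the /blog/ path are made-up examples:

```python
from urllib.parse import urlparse

start_url = "https://example.com/blog/"   # hypothetical start URL
start = urlparse(start_url)

def allowed_by_domain(link: str) -> bool:
    # My guess at "Restrict to start domain(s)": keep any link on example.com,
    # so https://example.com/about/ passes but https://twitter.com/... does not
    return urlparse(link).netloc == start.netloc

def allowed_by_subpath(link: str) -> bool:
    # My guess at "Restrict to sub-path(s)": keep only links under /blog/,
    # so https://example.com/blog/post-1 passes but https://example.com/about/ does not
    parsed = urlparse(link)
    return parsed.netloc == start.netloc and parsed.path.startswith(start.path)
```

If that’s roughly right, it sounds like the domain option is what I want, but I’d appreciate confirmation.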

Thanks