foot job definition

  • noun:
    • A sex act in which the genitals are stimulated by another person's feet.

