After watching the documentary “The Corporation,” what do you think of it?
Instructions:
Is “The Corporation” still representative of corporate culture in 2020?
Are corporations really psychopathic?
Does it seem appropriate to treat corporations as people?
Do you think corporations should have rights?
What components of social justice do you think apply to corporations?
What can business leaders gain from viewing this documentary?