Anyone know of any careers that revolve around educating and engaging employees at organizations? Specifically, around topics like diversity, inclusion, and social awareness? My partner would love to enter that field and could use advice on what direction to go in. Anything from job titles, to possible people to connect with to learn more, to potential career tracks to follow... thanks everyone!
comments
At one company we had a "director of culture" who maintained brand consistency as well as morale and team/company events. Looking back, it made a massive difference, and that company was much healthier than a lot of the toxic scenes I've seen lately. That was from 2010-2015; I'm unsure whether companies care now. That guy now leads an international incubator, which is also pretty cool.
I would chat with HR at your partner's new company to find out how they go about it, and perhaps have good friends at other companies ask the same.