Facebook needs us. It needed us at the beginning to create the "network effect": friends flocking together to see and be seen. But now it needs us more than ever, as Slate writes in a story titled "Who Controls Your Facebook Feed." As the company tries to become the be-all and end-all curator of our lives and our news, it's trying a few new things to get better at the job, and to stop annoying us. From more nuanced, trackable reactions to actual people on panels and focus groups telling the company what they really think, Facebook is trying to make its infamous algorithm better at predicting what we want, and at not depriving us of things we'd rather be seeing or reading. But the company's News Feed designers and engineers are the ones who ultimately have to sort out this dilemma, and like much of Silicon Valley, they're not exactly a diverse group.

Shrouded in secrecy, the News Feed algorithm (more accurately, a series of algorithms with "hundreds" of inputs and careful balances) can make or break the popularity of a video or post, including the Slate story itself or this one from SFist. It can also make or break a user's experience with the site.

After all these years, "Facebook’s news feed algorithm is surprisingly inelegant, maddeningly mercurial, and stubbornly opaque," Slate notes. "It remains as likely as not to serve us posts we find trivial, irritating, misleading, or just plain boring. And Facebook knows it."

Remember when you had to go to a friend's page to see their doings and postings? Those were the horse-and-buggy days before things changed for good in 2006. Enter News Feed, which pushed users' activity to our home pages. The problem is that between the Like button, which at first we didn't realize was teaching the algorithm about us, and whatever else the company tracks about our use of the site, News Feed may have reached a limit in its ability to guess our preferences. Of the thousands of possible posts and articles it could show us in any given hour, especially for users with a lot of friends, we may only ever see a hundred or so, and a recent test showed that many users were missing things they didn't want to miss.
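Facebook hasn't published how any of this works, but the shape the Slate story describes, many weighted signals scoring a large pool of candidate stories so that only a top slice ever surfaces, can be sketched in a few lines. To be clear, everything in this toy example is invented for illustration: the signal names, the weights, and the cutoff are assumptions, not Facebook's actual inputs.

```python
# A toy sketch of feed ranking, NOT Facebook's actual algorithm.
# Signal names, weights, and the cutoff below are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Story:
    author: str
    signals: dict = field(default_factory=dict)  # hypothetical signals, each 0..1

# A handful of invented weights standing in for the "hundreds" of real inputs.
WEIGHTS = {
    "affinity_with_author": 3.0,   # how often you interact with this friend
    "predicted_like": 2.0,         # a model's guess that you'll click Like
    "recency": 1.5,                # newer stories score higher
}

def score(story: Story) -> float:
    """Weighted sum of whatever signals the story carries."""
    return sum(WEIGHTS.get(name, 0.0) * value
               for name, value in story.signals.items())

def build_feed(candidates: list[Story], limit: int = 100) -> list[Story]:
    """Rank thousands of candidates, surface only the top hundred or so."""
    return sorted(candidates, key=score, reverse=True)[:limit]

feed = build_feed([
    Story("alice", {"affinity_with_author": 0.9, "recency": 0.5}),  # scores 3.45
    Story("bob", {"predicted_like": 0.4, "recency": 0.9}),          # scores 2.15
], limit=1)
print(feed[0].author)  # alice
```

The point of the sketch is the funnel, not the math: tweak one weight and a different hundred stories wins, which is why opaque, shifting weights can feel "maddeningly mercurial" from the outside.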

In 2014 the company convened its first "feed quality panel," headquartered in Knoxville, asking users to rate their satisfaction with News Feed and explain their ratings in writing. "Within months," Slate writes, "[the news feed] team had grown so reliant on the panel’s feedback that they took it nationwide, paying a demographically representative sample of people around the country to rate and review their Facebook feeds on a daily basis from their own homes."

Of course, all this information goes back to the source: Facebook's mothership in Menlo Park. And that's where some see a potential problem, a kind of cultural bottleneck.

The more we users give this team of clearly smart people, the better, for us and for Facebook. But can a group of eight young dudes possibly know how to speak for all of us?

For example, Facebook is scaling back its reliance on blunt systems like its strict real names policy in favor of more human checks and balances (a new appeals process with an oversight group is underway), and it's gauging our reactions in new ways (a set of Reactions buttons in lieu of the catch-all Like). As it does, it's probably worth keeping an eye on who represents us and our views to Facebook itself. Sure, the company isn't a representative democracy. But if Facebook wants us to tell this team how to curate our life's news for us, maybe it should look more like one.
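Those Reactions buttons bear directly on the ranking question, because they replace one binary signal with several graded ones. Facebook hasn't said how, or whether, it weights each reaction differently, so the sketch below is purely hypothetical: the reaction names match the real buttons, but the weights and the function are assumptions made up for illustration.

```python
# Hypothetical illustration: graded reaction signals instead of a binary Like.
# Reaction names match Facebook's buttons; these weights are invented.
REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 1.5,
    "haha": 1.2,
    "wow": 1.2,
    "sad": 0.8,    # engagement, but maybe not "show me more of this"
    "angry": 0.5,
}

def engagement_signal(reactions: dict[str, int]) -> float:
    """Collapse per-reaction counts into a single ranking input.

    With only a Like button this would just be reactions.get("like", 0);
    distinct reactions let a ranker weigh HOW people responded, not just whether.
    """
    return sum(REACTION_WEIGHTS.get(name, 0.0) * count
               for name, count in reactions.items())

print(engagement_signal({"like": 40, "love": 10, "angry": 5}))  # 57.5
```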

Related: Facebook Introduces Special Circumstances, Human Oversight To 'Real Names Policy' Enforcement