Brain supplements?

Yeah, but I don’t like coffee, and I can’t find a really inexpensive cappuccino that’s artificially sweetened.
It’s okay, I’ll just stick to caffeine pills. :+1:
By the way, I tried to “thank” your post, but I ran out of thanks for today! :frowning_with_open_mouth:

9700mg per capsule is like a 1,000,000-lumen AA flashlight. It looks completely sus; I would not take it. If you already received it, I would throw it in the trash.

There are a few companies that are good:

  • Pure Encapsulations
  • Life Extension
  • Thorne
  • Sports Research
  • NOW
  • Country Life
  • Blue Bonnet
  • Nootropics Depot

I may have forgotten some. There have been reports of counterfeit supplements on Amazon, so even if you find one of these brands there, I would try to buy it somewhere else.

As for Neuriva, I looked it up, and it looks like a grab bag of ingredients. PS (phosphatidylserine) hasn’t really panned out for anything.

2 Thanks

The “9700mg per capsule” is deliberately misleading.
It’s “the equivalent” of 9700mg, but it’s actually only 1000mg per capsule.
The supplement has very good feedback on Amazon, so I’m going to try it. :slightly_smiling_face:

Well, good luck … I pay for lab results for these things, and it’s a !@#$show. Hope it doesn’t fry your noodle or liver.

It does that to ants IRL!

2 Thanks

According to Fakespot, the reviews are most likely not legit.

1 Thank

I like Lion’s Mane powder with a coffee. Really good for focus. But the brain gets used to Lion’s Mane very fast (~4 days), so I only take it once in a while. If you’re really tired, though, it’s not going to work magic.

Edit: I don’t mix the powder with coffee… :slight_smile:

1 Thank

I don’t bother with Fakespot anymore.
Their ratings are mostly useless.
I find ReviewMeta to be much more useful, and following this link, the Amazon reviews look pretty legit (in my opinion).
Of course, interpreting ReviewMeta’s data is in the eye of the beholder, and others may not come to the same conclusion that I have. :slightly_smiling_face:

1 Thank

Based on comparing this single product across the two sites, it seems like Fakespot is less forgiving while offering fewer details, whereas ReviewMeta is more forgiving but offers more details.

I notice that on this specific product, ReviewMeta shows that half of the most-trusted reviews are negative, while all of the least-trusted reviews are positive. Based on this, I am highly suspicious of the product.

In the future I may use ReviewMeta as a second opinion and to check some of the stats that Fakespot doesn’t show.

1 Thank

I’ve been using Fakespot and ReviewMeta for years, and I have had them analyze hundreds of different products.
I think ReviewMeta does a better job, and that’s the general consensus online if you do some Google searches on the subject.
It does help to have some experience with ReviewMeta to interpret its data, though.

1 Thank

I’ve just been testing some products I know on Fakespot, and one or two got clobbered with an undeserved bad rating. I think Fakespot might be overly focused on detecting moved reviews, which can happen when a listing is rebuilt. A rebuilt listing can be an indicator of fraud, but it isn’t always.

I tried googling Fakespot vs. ReviewMeta just now, and the first result was a thread where someone says neither works properly anymore.

This might be true; I looked more deeply at the reviews that ReviewMeta is calling untrustworthy, and I’m thinking there are a lot of false positives happening. :confused:

1 Thank

:+1:
Yeah, I find actually reading Amazon reviews is even more useful than relying on Fakespot/ReviewMeta, though I do sometimes rely on ReviewMeta.

I go to the 4-star reviews first. The next thing I look at is the percentage of 1-star reviews; when the 1s are above 5 or 6%, I get cautious. This assumes the review count is sizable.

I always figure the 2s and 3s are most useful for finding potential problem areas. These are people who are too even-handed to immediately drop a 1-star review but critical enough to write down what’s bothering them. The 4s are good too, but they ultimately liked the product, so they might not point out dealbreakers. (Rough code sketch of this rule after the samples below.)

Sample 1-star review:
“It’s awful, I hate it.”

Sample 5-star review:
“It’s awesome, I love it.”

Sample 2- or 3-star review:
“Here is a list of reasons why I’m returning this product: …”

4 Thanks

It doesn’t matter if 100% of the reviews are legit. Self-reporting the effects of things like vitamins, medications, and nootropics is “anecdata” or “bro science”: very low-quality evidence, because it doesn’t account for uncontrolled variables, subjectivity, or the placebo and nocebo effects, among other things.

Yes, I realize that, of course, but on the other hand, I have had very good experiences by choosing supplements based on Amazon reviews.
Also, saying to just buy certain brands and not to buy supplements from Amazon is not helpful.
There are plenty of good brands that aren’t on your list, and plenty of good supplements on Amazon.
I do know that you’re trying to be helpful, and I am grateful for that, but I get great results my way and I’m not going to do things your way.

You wanna help your brain? Drink half a glass of pomegranate juice. Regularly.

But if you want a completely legal power boost: a large Starbucks on an empty stomach in the morning.

It all comes down to blood flow.
I strongly believe most supplements are a scam.

Save your money. There is no clinical or scientific lab evidence that these brain supplement products work. If you look at or listen to Neuriva’s advertising, their claims don’t even explicitly say anything about improving memory, concentration, etc., because they are trying to skirt the advertising laws and stay just out of trouble. They talk about improving or adding to “indicators of brain health.” Those are just the vague nutrient claims for general health that the US government allows. But they know people will interpret their ad claims AS IF they were making claims about memory, concentration, etc.

Prevagen makes explicit claims about memory, concentration, etc., but the study referenced in their ads was done by their own scientists working for their own company, and they had to throw out much of the data and analyze only subsets of the total results to get anything they could claim shows a statistically significant difference. In reality, if you look at all the data, there is no evidence supporting their ad claims.

1 Thank