Introduction to Theory of Constraints

Motivation and procrastination are hard to help with. I explicitly made them out of scope for my async tutoring, which is self-paced.

One tip is to try self-help books, but I know you already tried that.

Something I can do is make philosophy stuff and maybe some people will like it and be inspired or motivated. If some of my philosophy stuff seems particularly fun or interesting to you, you can say that (that’s worthwhile feedback for me), and you could try focusing on those topics more.

I would too. I would first pay Elliot for philosophy work such that he can do as much philosophy as he wants to.

A possible misaligned incentive would be to give more to posters who agree more with you. If the posters felt pressured to agree for money, that would compromise their integrity and make them worse off with respect to philosophy. It would have to be primarily based on how much effort they put in. Getting good at lots of uncontroversial prerequisite skills, being persistent in debate, doing rigorous analysis, and being honest would all be good metrics to base your support on. At the same time, I don’t think you should support your ideological enemies, so there would be a balance.

I think it would also have to be pretty much secret in order not to attract people because of the money. The interest has to come from philosophy itself.

But I wouldn’t give to the charities that Hank’s little brother participates in. I could give to him if he was grateful and was trying to get on his own feet.

Oh ok, I think I have in mind doing stuff like this. Like I think about the benefit of others and don’t stop to think of myself (e.g. when others ask me to do an activity with them). It leaves me with some kind of dissatisfaction, not because of them, but because I didn’t stop to think about how good doing the activity is for myself.

That’s a good question. Hank had so much, and he already did give a lot. He housed his family and, as I remember, was trying to keep the economy alive, helping others secure their jobs and keep making money.

It looks like Hank did have more than he needed, like he was rich and well off. On the one hand, I say he shouldn’t be an a-hole (sorry for my language) and should share his wealth. On the other hand, I say: why should he want to share his wealth? Like, why does he want to do it?

As I understand it (I haven’t studied this history) art patronage (was there patronage for philosophers/intellectuals as well?) was more popular before modern times. I think there was a mixture of commissions and just supporting the artist and letting him create what he wanted freely. I think letting the artist create freely is good. I think I would let Elliot work freely and then offer extra for specific philosophy work.

What would I spend on if I was rich? I would pay for convenience. I would pay to save time. I would pay for private tutoring. I would buy lots of art from Bryan Larsen and I would pay Roark to architect my house. I would put more money into new productive endeavors. But maybe a greater value than all those would be high quality discussions with peers. Supporting CF learners would be an investment to have great discussions later. So it would be in quite direct self-interest. I could also potentially hire someone who got really good at TOC for example.

But even if I was too busy to have lots of discussions, I would still value it. Knowing there’s a sanctuary for the best of the species and that I’m supporting it would be a value to me. It would make me happy. Likewise, I would give to CF if I was dying. It’s an abstract spiritual value. It doesn’t have to be of material self-interest to be in my interest, because ultimately it’s the spiritual values that matter.

Yes, I would do that, but for coaching.

Edit: my bad, I didn’t mean to send this draft yet.

I see you’re uncertain, but take this side for a moment, and let’s say the help he gives is very unlikely to be in his self-interest. Instead of giving to potential world class philosophers, he would give to those who are most wretched. Would you say Hank isn’t merely a jerk if he doesn’t give to them, but that he has a moral obligation/duty to do so, and is evil if he doesn’t?

I’m not sure whether being a jerk is just a degree of evil or not.

Dangit, I didn’t mean to post the reply-to-ET stuff yet; that was just a draft I was making.

just edit the post to say oops i didn’t mean to send this draft yet (and remove all the other text)


Consider this: would you relieve a 22-year-old Ayn Rand from her waitress work if you could? I absolutely would. Would it be in my self-interest? I think so.

I wouldn’t say that. It sounds bad to give money to the people who are most wretched.

“Them” is referring to the most wretched, right? If it refers to potential world class philosophers, then I don’t know. I would like to know why Hank wouldn’t.

Yes. But you might have the wrong idea of what wretched means. Here’s the New Ox definition I used:

(of a person) in a very unhappy or unfortunate state: I felt so wretched because I thought I might never see you again.

Btw there could be overlap, like Mallory would arguably be among the most wretched and a potential world class philosopher. But exclude the potential world class philosophers like Mallory.


You’re right, I thought it meant immoral or evil.

Ahhhh ok I think I could answer your question again.

I don’t know now, because I want to know: why are they so wretched? Like, will the money even help? I would say Rearden is evil if I thought people’s skills and achievements were determined from birth and there was no improving.


Hello, I want to try using the Y/N philosophy decision chart to interact with the article:
The questions below are ideas that get evaluated as yes or no. The problem that decided the questions as yes is: does this ask about something I didn’t know about?

Is there an article talking about silver bullets and how to make one?

Is there an article about how to go about silver bullets from TOC?

Does CR advocate a certain kind of gradualism?

Silver bullets sound cool. Is doing focusing steps a part of silver bullet solutions? After doing focusing steps IRL, the task I was doing gets way easier to do, in a way.

Context: I’ve read The Goal, but not analyzed it

The Goal

  • two main parts of TOC are goals and focus
  • goals that involve a process of ongoing improvement
  • aim at throughput, i.e., success at our goal
    • to succeed at our goal we need to focus on the things that actually improve progress towards the goal, i.e., we need to focus on global optima
      • global optima improvements improve the parts that hold back other parts from achieving the goal. those parts are called bottlenecks (constraints, limiting factors)
        • there’s usually only one bottleneck, like there’s only one weakest link in a chain, and the strength of the chain as a whole depends only on the weakest link
    • because most improvements optimize local optima and are either marginally beneficial, useless or even counterproductive towards our big vision goal

throughput: the whole process of achieving a goal, including getting to the end. Does that definition of “throughput” work?

In general, there’s only one bottleneck. This is like a metal chain: a chain has only one weakest link.

How do we know it’s like this in general? Why does the chain analogy apply in general? Why not a bunch of things going wrong at about an equal amount?

My guess is that we have just observed this to be the case.

I guess it’s not just in a technical sense, because there are rarely two factors that are exactly equally weak. Rather, there’s usually one factor that’s weaker by a substantial margin? Like the weakest link is at 10% while the next weakest link is at 37% rather than 12%. I’m thinking that a situation with one link at 10% and another at 12% would be a situation with two bottlenecks.

Even if a bunch of things were going wrong, usually one is going the most wrong. So focus on that one, and then move on to the other things which were going wrong. Maybe we can’t improve the other things until we fix the very weakest link. Maybe fixing the weakest link makes fixing the other weak spots faster.

Throughput is the amount of goal success, e.g. producing 500 widgets/day.
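A tiny sketch of the weakest-link idea (the station names and capacities are made up for illustration; this is just my way of restating the chain analogy in code):

```python
# Throughput of a dependent chain is capped by its weakest link.
# Hypothetical capacities in widgets/day for stations A -> B -> C.
capacities = {"A": 900, "B": 500, "C": 700}

bottleneck = min(capacities, key=capacities.get)
throughput = capacities[bottleneck]
print(bottleneck, throughput)  # B 500

# Improving a non-bottleneck changes nothing:
capacities["C"] = 1200
print(min(capacities.values()))  # still 500

# Improving the bottleneck raises throughput:
capacities["B"] = 650
print(min(capacities.values()))  # 650
```

That’s why local optimizations at A or C are wasted effort for throughput: only the minimum matters.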



Focusing Steps

Also don’t let the non-constraints produce more than the constraint can process.

If you do let that happen then you’ll build up some useless inventory that might be clogging up space. Maybe you’ll have to throw away something you spent time, energy and capital on producing.
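A quick sanity check of that (made-up rates; assuming the non-constraint keeps releasing at full rate):

```python
# If station A releases more per day than the bottleneck B can process,
# work-in-progress inventory piles up in front of B without bound.
a_rate, b_rate = 120, 100  # hypothetical units/day
inventory = 0
for day in range(30):
    inventory += a_rate                  # A releases a full day's output
    inventory -= min(inventory, b_rate)  # B processes as much as it can
print(inventory)  # 600: the excess (120 - 100) accumulates for 30 days
```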

Excess Capacity or Balanced Plants

  • variance and statistical fluctuations
  • a balanced plant aims at 100% utilization and 0% idle time
    • (simplified) every workstation would output the same amount
      • some stations would purposely do less than they could to fit in
  • assembly lines mean there are dependencies, i.e. station C gets its parts from B and B gets its parts from A
    • A produces more than B: we get extra parts that clutter things up
    • A produces less than B: B has wasted potential, it can only produce as much as A gives it. A is the weakest link and the chain can’t get stronger if we don’t strengthen A
      • to avoid this we can create a buffer in front of B such that it can work when A doesn’t produce enough
      • we can also increase the production capacity of A but stop it when it produced everything that B needs
  • buffers help some stations keep up when the stations they depend on are underproducing
    • bigger buffers give more safety but take up more resources so it’s a tradeoff
    • feeding stations have to halt work when the buffer is full
  • some stations can have excess capacity such that it will rarely produce less than what is needed for the next station
    • it will have to limit its output most of the time in order to not build up excessive buffers everywhere
    • this is an unbalanced plant. some stations have more capacity than others
    • relying solely on excess capacity and not using buffers carries risk because A can have a really bad day or be completely broken
      • with more excess capacity you need less buffer and vice versa
        • there’s a tradeoff and we mostly decide based on how expensive each are
  • a balanced plant which has a bad day will on average not replenish its buffer
    • A should have more capacity than B uses. it doesn’t have to be a lot more. the buffer can be gradually replenished if the average rate of A is higher than the rate B uses
  • buffers deal with variance and excess capacity replenishes the buffer
  • if C has bad luck extra inventory can pile up ahead of it
    • in order to reliably get rid of this inventory, C must have excess capacity
  • the amount of excess capacity doesn’t matter much, C will catch up pretty quickly anyways
  • everything has variance
    • you can reduce variance but not get to 0
  • design the plant around the bottleneck
    • there’s always a bottleneck so don’t try to avoid it, don’t try to build a balanced plant
    • multiple stations tied for least capacity only make the system chaotic due to variance
    • the bottleneck should usually be the most expensive thing, and since the non-bottleneck stations need lots of excess capacity, they should be cheap
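The balanced-vs-unbalanced point can be sketched with a toy simulation. Everything here is made up (two stations, uniform random daily output, a small buffer cap); it’s only meant to show that with dependencies plus variance, equal capacities underdeliver:

```python
import random

def run_plant(days, cap_a, cap_b, buf_max=4, seed=0):
    """Two-station chain A -> B. Each day a station produces a random
    amount up to its capacity; B can only process parts A has delivered,
    and A must halt when the buffer in front of B is full."""
    rng = random.Random(seed)
    buf = 0   # parts waiting between A and B
    done = 0  # finished output
    for _ in range(days):
        buf += min(rng.randint(1, cap_a), buf_max - buf)  # A fills the buffer
        b_out = min(rng.randint(1, cap_b), buf)           # B draws from it
        buf -= b_out
        done += b_out
    return done / days  # average daily throughput

# "Balanced" plant: equal capacities everywhere. B is often starved by
# A's bad days, so throughput falls well short of what B could do.
print(run_plant(2000, cap_a=6, cap_b=6))

# Unbalanced plant: A has excess capacity, so B is rarely starved and
# throughput is limited mainly by B itself (the designed bottleneck).
print(run_plant(2000, cap_a=9, cap_b=6))
```

This is the same lesson as Goldratt’s dice game in The Goal, just with two stations instead of five.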

Which workstations need buffers? Only the bottleneck.

why?

Bottlenecks are the ones that hold the rest back. If some station early in the line underproduces, it can cause all the rest of the stations to be underutilized. But when the shortage reaches the bottleneck, the bottleneck can just start drawing from its buffer, and from there on normal production can continue.

The buffer is there to make up for other stations underproducing.

And which workstations need excess capacity? Every non-bottleneck.

Because if one station in the chain has less or the same capacity, then that station can become the bottleneck. The stations after it can’t produce more than it produces. This could either cause the original bottleneck to stagnate at its buffer size or to use up its buffer.

A plant may have multiple bottlenecks if it has multiple production lines. It can also have a more complicated production line that isn’t a linear chain. E.g. there could be three workstations that make parts which feed into B, then B combines all those incoming parts, and then the output of B feeds into multiple later workstations. There could also be multiple B workstations, and they could have different production characteristics (e.g. one uses a fancy new machine, another uses an older machine, and a third uses hand tools). These details complicate the analysis, but the principles and conclusions remain similar.

  • analyze more complicated scenarios like these

This is related to decision making in general. Usually there are lots of factors that are easy to get plenty of, and only zero, one or a few factors which are hard to get enough of. With factories or life in general, if there are many factors which are hard to get enough of, what you’re doing may be too difficult, and you should give some consideration to changing approaches.

  • Think of things this can apply to

related thoughts

The most important thing to optimize in life is what to optimize. I.e., to optimize your focus. Focus on what matters. Keep in mind what your ultimate goals in life are. Don’t spend too much resources on minor goals or sub-goals.

Having a bunch of money is a bad ultimate goal in life. Money doesn’t have intrinsic value. If you don’t use it for anything then it’s no good. Having money is a sub-goal for other ends. But making a bunch of money, the act of producing, could be a great goal.

TOC Idea Explanations

Effect-Cause-Effect: we want to know the cause of a problem. We guess a cause and check whether it could work by considering which other effects that cause would also produce. We predict those effects, then test them against reality. If the predictions don’t match reality, then there’s a problem with our proposed cause. Either the cause isn’t true, we didn’t understand the effects of the cause, or something else interfered and caused a different effect to happen.

Have people estimate the time needed for a 50% chance to finish their task on time, not a 90% chance. A 90% or higher chance of finishing on time is what people use when they’re adding a margin of error to the time estimate for their individual task,

So we can use a lower confidence level for a big task or the whole project because it has a higher chance of good and bad luck balancing out. The individual tasks more often have big variances, so we have to account for huge setbacks. Whereas in big compound tasks, some individual tasks have really bad luck, but some other tasks have good luck to balance it out. So if there was bad luck on the whole, it usually isn’t as extreme.

and a 50% chance of finishing on time means no margin of error for that task.

I don’t understand why there wouldn’t be any margin of error. Wouldn’t no margin of error mean an estimate where you think you have a 1% chance of finishing on time, which is when everything goes perfectly?
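One way to see why a whole project needs proportionally less safety margin than its individual tasks (made-up numbers; the normal task-time distribution is just an assumption for illustration):

```python
import random
import statistics

rng = random.Random(1)

def task_time():
    # Hypothetical task duration: mean 10 days, std dev 4 days.
    return rng.gauss(10, 4)

# Compare the spread of one task vs. a project of 25 tasks,
# measured relative to the mean (so the scales are comparable).
single = [task_time() for _ in range(10000)]
project = [sum(task_time() for _ in range(25)) for _ in range(10000)]

def rel_spread(xs):
    return statistics.pstdev(xs) / statistics.fmean(xs)

print(rel_spread(single))   # ~0.40
print(rel_spread(project))  # ~0.08: good and bad luck partly cancel
```

The project’s standard deviation grows like the square root of the number of tasks while its mean grows linearly, so the relative margin needed shrinks (here by about √25 = 5×).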

For production, put safety buffers at bottlenecks, not at each individual step.

For inventory management, put buffers

Where would you put buffers for a software development project? Maybe for some really key library functions that will help the rest of the development? Maybe that’s too small scale?

Where would you put buffers for a grammar project? Maybe just the overall time and energy you think the whole project should take?

Critical Fallibilism has something to add about buffers. They’re connected with the concept of error correction.

I don’t understand what CF or error correction adds here. Isn’t spare resources already a part of buffers?

A drummer synchronizes many people like in a marching band. Another example is using a drum so that many rowers on a boat can pull their oars in sync.

That kind of sounds like a balanced plant because every station would output the same.

But I think the point is to minimize variance and to give a signal to correct errors by. If someone fell behind in the marching, and let’s say he’s blind too, he wouldn’t continue at the original pace; he would speed up for the next beat in order to match the drum.

That’s like a station that fell behind having excess capacity to make up for what it underproduced.

If everyone marched at maximum marching speed then when someone fell slightly behind due to variance they would never catch up (without running instead of marching).

Synchronization as in a balanced plant is a bad idea and actually gets unsynchronized because of variation. But synchronization is desired and can work when we have one bottleneck and excess capacity elsewhere.
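The catch-up point can be put numerically (made-up paces; the model is just that a marcher who falls behind closes the gap only if his top speed exceeds the drum’s pace):

```python
def gap_after(steps, pace, top_speed, initial_gap=3.0):
    """Distance a marcher remains behind the drum line after `steps`
    beats, given he starts `initial_gap` behind and tries to catch up."""
    gap = initial_gap
    for _ in range(steps):
        move = min(top_speed, pace + gap)  # go as fast as needed, capped by top speed
        gap = max(0.0, gap + pace - move)
    return gap

# Marching at exactly the drum pace: the gap never closes.
print(gap_after(100, pace=2.0, top_speed=2.0))  # 3.0

# A little excess capacity: the gap closes and stays closed.
print(gap_after(100, pace=2.0, top_speed=2.2))  # 0.0
```

Zero excess capacity means any setback is permanent; even a small surplus lets variance get corrected.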

A rope can tie hikers together so they don’t spread out. In the hike in The Goal, they put the slowest kid in front and said no one is allowed to pass him, which achieved a similar result without tying any kids with ropes. Similarly, don’t release raw materials onto the factory floor, to be worked on, faster than the constraint (the slow guy or bottleneck work station) can keep up with.

The rope restricts. It tells you to halt production. The drum tells you to speed up, the rope tells you to slow down.

Because a chain is an analogy for dependencies:

(This applies when there are dependencies – when you’re dealing with a chain. If there were actually a bunch of unrelated things, then it wouldn’t be a chain with a weakest link, and more like 20% of stuff would be important.)

And from Optimize Limiting Factors

If a factory has run for a while, and built stuff, then most factors involved in production have excess capacity. But if you’re building a new factory and haven’t started yet, then you may have many limiting factors. You need walls and all the tools and people and everything. Without adding all those things, the factory won’t work. This can be seen in Goldratt’s chain analogy. A chain has one weakest link. But that assumes the chain is actually built and functional. If you haven’t made the chain yet, then every link is a limiting factor – they all need to be made before the chain will work. So it’s important to differentiate between functional systems (most factors have excess capacity) and systems that haven’t functioned yet (they may have many limiting factors before they will work).